My 2026 Tech Radar: Tools and Skills I'm Exploring This Year

Every year, the list of things I want to learn grows faster than the list of things I actually learn. That's just the nature of being a software engineer: there's always a new framework, a new tool, a new paradigm demanding your attention. The trick is being intentional about what you pick up and what you let go.
Last year I discovered the Thoughtworks Technology Radar through The Pragmatic Engineer podcast, and it changed how I think about this. The Radar organizes technologies into rings (Adopt, Trial, Assess, Hold) based on where they sit in maturity and usefulness. It's a clear way to frame your own exploration: what are you going all-in on, what are you experimenting with, and what are you just keeping an eye on?
So here's my personal tech radar for 2026. Not a rigid plan, more of a map of where my curiosity is pulling me this year. A little late to the year, I know, but here we are.
AI-Powered Development (Trial & Adopt)
This is the big one. AI tooling went from "interesting experiment" to "daily driver" for a huge number of developers over the last year.
I've been using Claude Code for my personal projects, and it's proven to be genuinely powerful. But I've learned a few things the hard way. If you spend the time on good prompting and planning your tasks and specifications, Claude Code is really effective. However, the bigger or longer the tasks you give it, the more likely it is to overdo things, get sidetracked, or let code quality slip. My current strategy is to write detailed tasks in a plan document and direct the agent to tackle one task at a time. That gives me more control over what it does, and I can review code changes more frequently.
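For illustration, a plan document for me is just a markdown file of small, reviewable tasks. The project and task details below are made up, but the shape is the point: each task is scoped tightly enough to review in one sitting.

```markdown
# Plan: add search to the notes app

## Task 1: index notes on save
- Write an indexing function that runs after a note is saved.
- Keep it synchronous for now; no queues.
- Done when: existing tests pass and new notes appear in the index.

## Task 2: search endpoint
- Add GET /search?q=... returning note ids ranked by relevance.
- Done when: tests cover empty and multi-word queries.
```

I point the agent at one task, review the diff, and only then move on to the next.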
Beyond Claude Code, there's a whole ecosystem I want to dig into this year:
- MCP (Model Context Protocol): A protocol that lets AI tools connect to external data sources and services. Docker's MCP feature is particularly interesting for containerized workflows.
- RAG & LangGraph: Building retrieval-augmented generation pipelines and agent workflows. This is where AI tooling gets really practical for production use cases. I've been putting these on the backlog for way too long, so this year I'm committing to actually building a project with RAG and LangGraph, not just reading about it.
- shadcn/ui & v0: Using AI to scaffold and generate UI components. The v0 approach of describing what you want and getting functional React components back is compelling.
- Lovable: Another AI-powered app builder I want to give a go. I've heard really good things about it from colleagues.
- LibreChat: A self-hosted AI chat interface with MCP server support. My plan is to pair it with the custom MCP servers and RAG pipelines I'll be building, so I can use LibreChat as a single interface to quickly query and access my own data.
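Before reaching for LangGraph and a vector store, it's worth remembering that the core retrieval loop in RAG is small. Here's a toy sketch where keyword-overlap scoring stands in for real embeddings and the LLM call is left out entirely; everything here (documents, function names) is illustrative, not a real pipeline:

```python
# Toy sketch of the retrieval half of a RAG pipeline.
# Keyword overlap stands in for embedding similarity; in a real build,
# a vector store and an actual model call would slot in here.

def tokenize(text: str) -> set[str]:
    """Lowercase, split on whitespace, strip basic punctuation."""
    return {w.strip(".,?!").lower() for w in text.split()}

def retrieve(query: str, docs: list[str], k: int = 2) -> list[str]:
    """Return the k documents sharing the most tokens with the query."""
    q = tokenize(query)
    scored = sorted(docs, key=lambda d: len(q & tokenize(d)), reverse=True)
    return scored[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    """Assemble the retrieved context into a grounded prompt."""
    context = "\n".join(f"- {d}" for d in retrieve(query, docs))
    return f"Answer using only this context:\n{context}\n\nQuestion: {query}"

docs = [
    "Meilisearch is a fast, developer-friendly search engine.",
    "ClickHouse is a columnar database built for analytics.",
    "n8n is a workflow automation tool you can self-host.",
]
prompt = build_prompt("Which tool is a columnar database?", docs)
```

Swapping `tokenize` for an embedding model and `retrieve` for a vector search is essentially the upgrade path that frameworks like LangGraph orchestrate.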
The common thread here is that the most impactful AI tools are built by engineers who understand the strengths and limitations of LLMs and design great software around them. That's where I want to be, not just using AI, but understanding how to build with it effectively.
Data Engineering (Trial)
Working on a large-scale ETL platform last year gave me a taste for data engineering, and I want to go deeper on established data tools.
- Apache Spark: The industry standard for large-scale data processing. Understanding Spark properly opens doors to serious data pipeline work.
- ClickHouse: A columnar database built for analytics. Fast, open-source, and increasingly popular for real-time analytics workloads. I used ClickHouse at a previous role, but only at the surface level.
- data-contract-cli: This one is about data governance. After seeing firsthand how early data governance sets a project up for success, I want to explore tooling that enforces data contracts between producers and consumers.
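To make the data-contract idea concrete, here's roughly what a contract file looks like; this is a sketch based on my reading of the Data Contract Specification that data-contract-cli consumes, so check the official docs before relying on exact field names. The dataset and owner are invented:

```yaml
dataContractSpecification: 1.1.0
id: orders-contract
info:
  title: Orders
  version: 0.1.0
  owner: data-platform-team   # hypothetical owning team
models:
  orders:
    description: One row per completed order.
    fields:
      order_id:
        type: string
        required: true
      amount_eur:
        type: decimal
        required: true
      placed_at:
        type: timestamp
```

The appeal is that a file like this lives in version control, so a producer changing `amount_eur` to a different type becomes a reviewable diff rather than a silent breakage downstream.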
Data engineering is one of those areas where the fundamentals pay off everywhere. Even if I stay primarily in application development, understanding how data flows through systems at scale makes me a better engineer.
Web Development (Trial & Keep)
- Fastify: I've been curious about this Node.js framework for a while. I might try to vibe code (responsibly) some of my side projects with Fastify.
- Rails: I keep coming back to Ruby on Rails. There's something about the developer experience and the convention-over-configuration philosophy that makes it deeply satisfying to build with.
- Spring: My day job runs on Java and Spring. Continuing to sharpen this is a must.
- Meilisearch: A search engine that's developer-friendly and easy to self-host. I used Meilisearch at a previous role and had a great experience indexing data for quick searches in a public repository of healthcare services. It handled that use case beautifully.
Mobile Development (Maybe)
This is more of a "get back to it" category. I've let mobile development slide, and I want to change that.
- React Native and Expo: The pragmatic choice. I already know React and worked with React Native and Expo a while back, so the ramp-up cost is low, and the ability to ship to both platforms from one codebase is hard to argue with.
Desktop Development (Maybe)
- Tauri: A Rust-based alternative to Electron for building desktop and mobile apps. Lighter, faster, and it doesn't ship a full Chromium instance with every app. I'm curious to see how mature this is in 2026.
Infrastructure & Platform Engineering (Trial)
I've been adjacent to DevOps work for a while, but I want to get more hands-on with the infrastructure side.
- Kubernetes: Time to get back into K8s. Running containerized workloads at scale is a core skill I want to solidify, especially as I think about deploying AI services and data pipelines.
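As a refresher for myself, this is the kind of minimal Kubernetes Deployment I'd start from when putting an AI service behind K8s; the service name and image are placeholders, not a real project:

```yaml
apiVersion: apps/v1
kind: Deployment
metadata:
  name: rag-api                # hypothetical service name
spec:
  replicas: 2                  # two pods for basic availability
  selector:
    matchLabels:
      app: rag-api
  template:
    metadata:
      labels:
        app: rag-api
    spec:
      containers:
        - name: rag-api
          image: ghcr.io/example/rag-api:0.1.0   # placeholder image
          ports:
            - containerPort: 8000
          resources:
            requests:            # scheduling hints, not hard limits
              cpu: 100m
              memory: 256Mi
```

Even a small manifest like this forces the questions that matter in production: how many replicas, what resources the workload actually needs, and how traffic reaches the pods (which a Service, not shown here, would handle).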
Platform engineering as a discipline is becoming increasingly important, and I think understanding this layer well makes you a more complete developer. You don't have to become a full-time DevOps engineer, but knowing how your code runs in production (really knowing) changes how you write it.
Automation (Adopt)
- n8n: I've been running n8n in my home lab for automation workflows, and I want to keep expanding what it does. Home automation, data pipelines, notification systems: n8n is the glue that ties my self-hosted setup together. Combined with MCP servers and local AI models, there's a lot of potential here.
Cybersecurity (Trial)
This one might seem like an outlier. Cybersecurity, specifically ethical hacking, was one of my favorite topics in university. With AI making it easier than ever to learn and experiment, it feels like the right time to revisit it.
- Kali Linux: The standard toolkit for penetration testing and security research.
- AI + Security: How AI is changing both the attack and defense landscape. Understanding this intersection is becoming essential.
Understanding how systems get compromised makes you better at building systems that don't. And honestly, it's just fun.
Closing Thoughts
Looking at this list, it's ambitious. Probably too ambitious. But that's fine: this isn't a checklist I need to complete by December. It's a map of what interests me and where I think the most valuable learning is right now.
The real strategy is the same one that worked last year: pick something, go deep enough to be useful, then move on. Not every tool on this list will earn a permanent spot in my workflow. That's the point.
Not everything needs to be adopted. Sometimes it's enough to trial something, assess it, or just know it exists. What matters is staying curious and being intentional about where you spend your time.
Here's to a year of building, breaking things, and learning constantly.