
IT Business Analysis Is… Analysis

In my previous post, I started reflecting on some of the key underrated factors in IT business analysis — namely, the technical foundation (an IT background). But there’s a second, no less critical factor for an IT business analyst that people tend to overlook just as often: analysis itself — or more precisely, the inclination toward and ability to perform it.

For me (and, I believe, for many others even more so), analysis has always felt like something vague. You think you understand what it is, but when you try to explain it clearly to someone else, you hit a wall and end up relying on buzzwords like “systems thinking,” “analytical thinking,” “logical thinking,” and so on. And trying to properly evaluate analytical skills in an interview? That borders on fortune-telling. I’m interested in exploring this topic in a free-form, reflective way, so this post will be more about thoughts than any actionable advice. I won’t be referencing any hefty sources or diving deeply into theory, because I don’t claim to have that level of expertise. As mentioned, I’ll be working from the understanding I’ve developed through years of practice.

Let’s start with the obvious: what is analysis, anyway? Without overcomplicating things, here’s a dictionary-style definition: analysis (literally “breaking down, separating, disassembling”) is a method of investigation characterized by identifying and studying the individual parts of a subject. We’ve all heard of financial analysts and other types, so we label ourselves business analysts — while often conveniently ignoring the very core of the title: analysis itself.

Here’s how I understand the essence of an analyst, without tying it to any specific field: an analyst is someone who is exceptionally good at working with information. They can sift through mountains of data and produce something meaningful and useful from it.

As an illustrative example, I love the idea of Sherlock Holmes as the archetype of an analyst. More specifically, the modern Sherlock played by Benedict "what’s-his-name". Remember those scenes where Sherlock looks at something and observations flash across the screen? Each of those is a nugget of information he picks up through keen observation. Then, in mere seconds (unlike our long, painful thought processes), he begins forming causal and other types of relationships between these bits, filters out noise, identifies trends, and draws conclusions. Incidentally, if you ever played any of the Sherlock-themed video games, they visualize this beautifully: you have to draw connections between facts and assumptions yourself to ultimately arrive at a conclusion. In short, Sherlock is the ideal analyst — unattainable, of course, but still aspirational.

Another example is less vivid but stuck with me nonetheless. In Dan Brown’s once-popular novel Digital Fortress, one of the main characters is a woman who works as an analyst for the NSA. I don’t remember every detail, but the general idea was that her job — or that of one of her colleagues — involved processing massive amounts of raw data from various sources (like a human supercomputer), filtering out what was relevant to her superiors, and regularly delivering concise, actionable summaries. At the time, this was a kind of revelation for me: that’s basically what analysts do. The result could take many forms — conveying the processed data to others in a goal-oriented format, identifying patterns, formulating and justifying conclusions, etc. So, an analyst is essentially a specialist in working with information.

If I were to design a knowledge framework for business analysis from scratch, I think I’d put business analysis information at the very center (that’s what it’s called in the BABOK Guide). Maybe I didn’t study BABOK closely enough and it already does, but I’d definitely emphasize information more strongly in the context of BA tasks. Not goals, because we already know the goal is to bring value or “happiness” to organizations in one form or another — but rather tasks, meaning the steps we take to deliver that value. As described above, the essence of any analyst lies in how they work with information — and business analysts are no exception. Every area and phase of BA work revolves around interacting with information: diving into new domains; collecting countless volumes of data from stakeholders, documents, and other sources; analyzing that data; drawing conclusions (needs, solutions, requirements, etc.); clearly and thoughtfully communicating insights to others — and so on. Everything else — communications, planning, management, and all those smart-sounding, useful words — exists to support this work with information. So really, aren’t we all just Sherlock Holmes? Minus, perhaps, his deficit of emotionally intelligent communication.

So, an analyst is someone who knows how to work with information — really work with it. They know how to gather or extract it well, analyze it properly (surprise), classify it meaningfully, and operate with both facts and assumptions at any stage. They spot patterns, detect trends, link chunks of information together, generate relevant conclusions based on the context — and finally, communicate it all to whoever needs to know (and, you guessed it, do so effectively). Sounds about right? Now let’s pause and ask ourselves: how often do we actually check, evaluate, or even train these skills?

Let me outline what I see as the typical modern BA interview flow:

  1. What books on business analysis have you read? What are “requirements” and what types are there? What does INVEST stand for? Are business rules requirements?
  2. What’s Scrum? What are the roles? How many pages in the latest guide? And why is it now uncool to say "grooming"?
  3. Show us your specs examples.
  4. And if it’s a joint interview, then: “Where do you see yourself in five years?” and other psychomagic from HR.
I’m not saying this checklist is flawed. But when followed mechanically (without attempting to evaluate other, less obvious, yet equally important qualities), I’m convinced it won’t help you spot a truly high-potential analyst. So yes, you get some knowledge check and a side of HR mysticism. Here’s what I’d suggest adding — and what I’ve been doing myself (with varying success, naturally):

  • IT background — either directly or indirectly, noticing what terms the person uses, whether they understand what they’re talking about, and how consciously they apply those terms.
  • Soft skills — especially analysis and everything related to it: structured/analytical thinking and information handling. Plus all the things that enable this: attention to detail, ability to learn, intelligence, communication skills, etc.

If we rebuild the interview flow with this in mind, the initial theoretical parts can be trimmed way down. Because if we’re not assessing hands-on experience but rather knowledge of theory (e.g., with juniors), then a person with strong information-handling skills:

  1. Will pick up what they need pretty quickly — they can digest vast amounts of material in a new domain and start analyzing and structuring it. P.S. By “vast” I mean books. Remember those? And if your memory cells are calibrated for tweets, and your idea of learning is Instagram posts with emojis or 15-minute YouTube explainers — you’re probably not in the right place (dead serious).
  2. Will have a black belt in Googling (which is information work) and won’t break a sweat over something like “how many events are in Scrum” (honestly, do you really need to store that in your brain forever?).
  3. Will carry the day thanks to solid problem-solving skills — which, guess what, are deeply intertwined with analytical thinking.

Now, I’m not going to pretend I’ve got a scoring system ready to measure all this. This isn’t a step-by-step self-help guide or a hiring checklist. What I can do is share a few thoughts on the traits that cluster around analysis.

Systems thinking. This is the ability to look at anything — a thing, a process, a phenomenon — and see it as a system, with components and interconnections. An engine is a system of valves and other parts. A car is a system of systems — one of which is the engine. A highway is a system of infrastructure, vehicles, and pedestrians. So, basically, systems thinking is our ability to mentally break things down into parts, spot relationships, and derive the properties of the whole from those of its components.
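The engine-and-car example above can be made concrete with a tiny sketch: a system as a composite of parts, where each part may itself be a system, and a property of the whole (here, a simple count of atomic parts) is derived from its components. This is a hypothetical illustration; the class and part names are mine, not from any BA framework.

```python
# A minimal sketch of "a system of systems": each System has parts,
# and a part may itself be a System (the composite pattern).
class System:
    def __init__(self, name, parts=None):
        self.name = name
        self.parts = parts or []

    def leaf_count(self):
        # Derive a property of the whole from its components:
        # count the atomic parts at every level of nesting.
        if not self.parts:
            return 1
        return sum(part.leaf_count() for part in self.parts)

# The engine is a system of parts; the car is a system of systems.
engine = System("engine", [System("valve"), System("piston"), System("crankshaft")])
car = System("car", [engine, System("wheels"), System("body")])

print(engine.leaf_count())  # 3 — the engine's atomic parts
print(car.leaf_count())     # 5 — the car bottoms out in 5 atomic parts
```

The point isn’t the code itself — it’s that systems thinking is exactly this mental move: decompose, relate, then reason about the whole from its parts.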

Analytical thinking. Closely tied to the above. This is how well we perform both analysis and synthesis: the ability to dive into detail and zoom back out, to shift abstraction levels, to move between a bird’s-eye view and deep-dive mode with ease.

There’s even a theory that analytical thinking is a cognitive style — and that it competes with intuitive thinking. You’ll see this reflected in the MBTI framework in two of the four scales:
  • Sensing vs. Intuition: Do you prefer tangible facts, or gut-feel and patterns?
  • Thinking vs. Feeling: Do you make decisions based on logic, or on emotion?
Now imagine that you naturally sit on the “wrong” side of both scales. That leaves just 4 out of 16 MBTI types who are wired for solid analytical work. If we assume an even distribution, we’re down to 25% of the population. Yes, I know — real people aren’t that one-dimensional, and no one is 100% type A or type B. But still, the best analysts will usually have a strong tilt toward these specific preferences. Of course, this is an oversimplification. But it does illustrate how important these traits are when we evaluate people for analytical roles. And how much more useful they are than knowing the textbook difference between verification and validation.

By the way, I do think these traits can be developed over time — through conscious effort and consistent practice. It’s a habit: thinking clearly, logically, and systematically. It’s about choosing to make decisions based on logic and facts — not vibes or mental tarot cards (which, ironically, is a kind of subconscious analysis). It helps to do logic puzzles. Or play well-designed quest games. Or build systems for managing your life (like GTD). There’s a whole internet full of tools — I just want to highlight that this matters.
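For what it’s worth, the 4-out-of-16 arithmetic is easy to check by brute force. A throwaway enumeration (purely illustrative; the four MBTI scales are E/I, S/N, T/F, J/P):

```python
from itertools import product

# Enumerate all 16 MBTI type codes from the four binary preference scales.
types = ["".join(t) for t in product("EI", "SN", "TF", "JP")]

# Keep only the types on the "analytical" side of both relevant scales:
# Sensing (S) over Intuition, and Thinking (T) over Feeling.
analytical = [t for t in types if t[1] == "S" and t[2] == "T"]

print(analytical)                    # ['ESTJ', 'ESTP', 'ISTJ', 'ISTP']
print(len(analytical) / len(types))  # 0.25 — the 25% from the text
```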

Problem-solving skills. This is the ability to not just face unusual situations, but handle them effectively. It ties into systems thinking because a solid systems approach is often half the solution. People fall somewhere on a scale — from “total shutdown in the face of the unexpected” to “I’ve got 20 action plans ready before the first sign of trouble even finishes loading.” If you can look at a messy, ambiguous situation, analyze it, generate options based on current constraints, assess their viability, and create a plan — you're getting close to that ideal.

Here’s the kicker: practically every BA technique is a tool designed to facilitate analysis. Which means they’re useful not just for work, but for life in general — and practicing them reinforces the core skills we’re talking about. SWOT. Ishikawa diagrams. Brainstorming. SMART. Mind mapping. Each one teaches you to approach chaos systematically. It’s a virtuous loop: practice strengthens the muscle; the stronger the muscle, the better your outcomes.

In my previous post, I shared signs that a BA lacks IT knowledge. Now let me give you some common red flags I’ve seen related to weak analytical skills — mainly from my training experience, but they show up in real projects too:
  • Struggling with new theory. Even when BA concepts are presented clearly, people often can’t connect the dots, retain details, or form a cohesive picture. If someone can’t absorb and structure new information effectively, they’ll flounder with the information firehose that is real-world analysis work.
  • Sloppy attention to detail. You’ve run ten discovery sessions, gathered tons of input. Ideal outcome: every word is accounted for or marked for follow-up. What happens instead? People forget, ignore, overlook. That’s a recipe for catastrophic quality issues.
  • Inability to see the system. Let’s say you need to decompose a solution scope. This requires splitting it into coherent, logical chunks. Many struggle with this — producing uneven coverage, mismatched abstraction levels, or mixing user actions with system features.
  • Contradictory requirements. For example, writing a spec where sections live in isolation: data definitions go one way, UI goes another, quality attributes live their own lives. To spot the connections (like how a quality requirement implies a new feature), you need to zoom out and see the whole picture — from multiple angles and heights.

I could list more. These gaps hurt. And they trace back to one root cause: lack of applied analytical thinking. Yes, we’re also facilitators, translators, and bridge-builders. But first and foremost — we are analysts. Let’s not lose the very essence of the role. Because if you remove both the “IT” and the “analysis” from IT Business Analysis… All that’s left is someone who looks nice and talks well in front of a client. Sometimes that’s enough. But I hope, for your sake and your team’s, that you’re aiming for more.