ARTIST EVALUATION FRAMEWORK

A unified intelligence model for signing decisions. Three lenses, corroborated signals, one clear picture.

Campaign Operations | March 2026 | Strategic Analytics + A&R + Commercial

Right now, when A&R evaluates an unsigned artist, the process relies heavily on a single signal: a chart going up and to the right. The room reacts to the slope of the line, not to what the line actually represents.

An artist doing 1M streams consistently across 3 releases with flat growth has proven they have a real, durable audience. An artist growing 57% from 118 to 186 streams has a great-looking chart and almost no audience. Both get presented the same way.

The result: the label is using surface-level data to make high-stakes investment decisions. Volume, consistency, engagement depth, scene context, and management quality all matter — but there's no system that brings them together into a unified picture.

No single data source gives you the full picture. Good intelligence comes from corroborating multiple signals into a coherent story, then using that story to make a decision.

This framework evaluates unsigned artists through three lenses. Each captures a different dimension. Together, they produce a picture that's more complete than any single department can generate alone.

📊

Lens 1: Scale

How big is this artist right now?
  • Monthly listeners
  • Total catalog streams
  • Social follower base
  • Live ticket capacity
"Volume is context. Before you look at any trajectory, anchor: how big is this artist?"
📈

Lens 2: Trajectory

Where are they going?
  • Growth rate across windows
  • Consistency across releases
  • Listener retention rate
  • Organic vs. artificial growth
"A spike is noise until proven otherwise. What makes it real? Multiple signals across multiple releases."
🔍

Lens 3: Intelligence

What else do we know?
  • Network / collaborators
  • Management situation
  • Live performance ability
  • Scene / cultural positioning
"Data can tell you if it's working. Only human intelligence can tell you why — and whether it will keep working."
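One way to make the three lenses concrete is a single evaluation record that forces every signing conversation to fill in all three. A minimal sketch in Python; the field names and types are illustrative assumptions, not a fixed schema:

```python
from dataclasses import dataclass

@dataclass
class Scale:
    """Lens 1: how big is this artist right now?"""
    monthly_listeners: int
    catalog_streams: int
    social_followers: int
    ticket_capacity: int

@dataclass
class Trajectory:
    """Lens 2: where are they going?"""
    growth_rates: dict          # window label -> growth fraction, e.g. {"90d": 0.12}
    release_consistency: float  # consistency score across releases (assumed 0-1)
    retention_rate: float       # listeners retained between releases
    organic_share: float        # fraction of growth judged organic vs artificial

@dataclass
class Intelligence:
    """Lens 3: what else do we know?"""
    collaborators: list
    management: str
    live_proven: bool
    scene: str

@dataclass
class ArtistEvaluation:
    """Unified picture: one record, all three lenses, no signal lost."""
    artist: str
    scale: Scale
    trajectory: Trajectory
    intel: Intelligence
```

The point of the single record is structural: an evaluation with an empty Intelligence section is visibly incomplete, rather than silently missing.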
Campaign Ops sits at the intersection of these three lenses. Strategic Analytics owns the data. A&R owns the taste. Commercial owns the market context. This function translates between them and ensures no signal gets lost.

Every evaluation starts with the same question: What is the business decision you need to make, and what confidence level does that decision require?

For unsigned artists, there are three tiers of decision. Each requires progressively more intelligence:

Decision 1: Pursue or pass?
  • Data required: Scale + basic trajectory (Lens 1–2 surface data)
  • Confidence: Directional (quick read)
  • Who inputs: Strategic Analytics

Decision 2: How aggressively?
  • Data required: Full trajectory + engagement depth + scene context
  • Confidence: Corroborated (multiple signals aligned)
  • Who inputs: Analytics + A&R + Commercial

Decision 3: What's the offer?
  • Data required: All three lenses + management intel + deal comps
  • Confidence: High (unified picture)
  • Who inputs: Analytics + A&R + Business & Legal
The critical insight: Decision 1 can be made with relatively thin data. Decision 3 requires thick, corroborated intelligence from multiple sources. Right now, the label frequently uses Decision 1-level data to make Decision 3-level commitments.
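The decision tiers can be expressed as an explicit check: does the intelligence on hand actually cover what the decision requires? A hypothetical sketch; tier contents and signal names are assumptions drawn from the table, not an agreed taxonomy:

```python
# Hypothetical mapping of decision tiers to the intelligence each requires.
DECISION_TIERS = {
    1: {"question": "Pursue or pass?",
        "required": {"scale", "trajectory"},
        "confidence": "directional"},
    2: {"question": "How aggressively?",
        "required": {"scale", "trajectory", "engagement", "scene"},
        "confidence": "corroborated"},
    3: {"question": "What's the offer?",
        "required": {"scale", "trajectory", "engagement", "scene",
                     "management", "deal_comps"},
        "confidence": "high"},
}

def sufficient_for(decision: int, available: set) -> bool:
    """True if the signals on hand cover everything this decision tier needs."""
    return DECISION_TIERS[decision]["required"] <= available
```

Running `sufficient_for(3, {"scale", "trajectory"})` returns False: exactly the mismatch described in the text, Decision 1-level data backing a Decision 3-level commitment.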

Before evaluating trajectory, you have to anchor scale. How big is this artist right now? Growth percentages are meaningless without context — 50% growth at 500K monthly listeners is a fundamentally different signal than 50% growth at 5K.

Preliminary Volume Tiers

To be validated with Strategic Analytics

  • Under 50K (Embryonic): Very early. Audience exists but hasn't been tested across releases. High variance.
  • 50K–250K (Developing): Has an audience. Enough data to start evaluating consistency and retention patterns.
  • 250K–500K (Emerging): Something real is happening. Multiple signals should be corroborating by now.
  • 500K–1M (Established Unsigned): Proven they can build. The question shifts from "is it real?" to "why haven't they been signed?"
  • 1M+ (Rare Unsigned): Significant pre-existing audience. Either a massive opportunity or a red flag worth investigating.
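The preliminary tiers, and the growth-in-context point, reduce to a few lines of code. A sketch assuming the thresholds above, which are still to be validated with Strategic Analytics:

```python
def volume_tier(monthly_listeners: int) -> str:
    """Map monthly listeners to a preliminary volume tier (thresholds assumed)."""
    if monthly_listeners < 50_000:
        return "Embryonic"
    if monthly_listeners < 250_000:
        return "Developing"
    if monthly_listeners < 500_000:
        return "Emerging"
    if monthly_listeners < 1_000_000:
        return "Established Unsigned"
    return "Rare Unsigned"

def growth_rate(prev: float, curr: float) -> float:
    """Period-over-period growth as a fraction of the starting value."""
    return (curr - prev) / prev
```

`growth_rate(118, 186)` evaluates to roughly 0.576, the headline growth from the opening example: a great-looking slope on a tiny base, while an artist flat at 1M monthly listeners shows 0% growth and a far more proven audience. Always read the percentage against the tier.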
Key principle: "Big and steady" is a strong signal. An artist with sustained volume across multiple releases has proven audience durability — even with flat growth. The default expectation for any spike should be reversion to the mean until multiple signals prove otherwise.

Given a large enough sample, performance reverts to the average. When an unsigned artist has a sudden spike, the statistically correct default assumption is that they will come back down. The spike is noise until proven otherwise.

What Proves a Spike Is Real

  • Sustained growth across multiple releases
  • Engagement metrics that hold post-spike
  • Geographic spread that widens, not concentrates
  • Listener retention between releases stays above baseline

What Suggests a Spike Is Noise

  • Single-song virality without catalog lift
  • Playlist-driven discovery without organic follow-through
  • Social growth that doesn't convert to streams
  • Short track record with no prior consistency data

The evaluation framework should explicitly flag which category an artist's recent performance falls into — not to discourage pursuit, but to inform the confidence level and structure of the offer.
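One hedged way to operationalize that flag is to count corroborating indicators against noise indicators from the two checklists. The keys and thresholds below are illustrative assumptions, not a validated model:

```python
def classify_spike(signals: dict) -> str:
    """Flag a recent spike as corroborated, likely noise, or unproven.

    `signals` maps indicator names (hypothetical keys) to booleans;
    missing indicators are treated as not observed.
    """
    real = sum([
        signals.get("multi_release_growth", False),
        signals.get("engagement_holds_post_spike", False),
        signals.get("geo_spread_widening", False),
        signals.get("retention_above_baseline", False),
    ])
    noise = sum([
        signals.get("single_song_viral_only", False),
        signals.get("playlist_driven_only", False),
        signals.get("social_without_streams", False),
        signals.get("thin_track_record", False),
    ])
    if real >= 3 and noise == 0:
        return "corroborated"
    if noise > real:
        return "likely noise"
    return "unproven"  # statistical default: expect reversion to the mean
```

Note the default branch: absent evidence either way, the output is "unproven", matching the mean-reversion assumption rather than the slope of the chart.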

Data tells you if it's working. Human intelligence tells you why — and whether it will keep working. The qualitative layer isn't a replacement for analytics. It's the context that turns numbers into a decision.

Network & Collaborators

Who has the artist worked with? What producers, writers, or featured artists are in their orbit? Relationships predict trajectory. An artist who writes for established acts or co-produces with in-demand collaborators carries de-risked creative credibility.

Management Situation

Is the artist professionally managed? By whom? Management track record is one of the strongest predictors of post-signing execution.

Live Performance

Can they perform? Have they sold tickets? Live viability is a massive de-risker and increasingly central to long-term revenue.

Cultural Positioning

Are they part of an ascending scene? Are they the tip of the spear or a late arrival? Scene momentum is a rising tide, but it recedes.

Work Ethic & Partnership Potential

Will they be collaborative or difficult? Track record of reliability matters for deal structure and team commitment.

This is where A&R's expertise plugs in. Campaign Ops doesn't generate qualitative intelligence — it creates the structured space for A&R to contribute it alongside the quantitative picture, so that both inform the decision.

This framework defines the thinking. The model is built through structured input from the stakeholders who own each lens:

1. Define the Criteria
   Each key stakeholder (Strategic Analytics, A&R, Commercial) independently answers: "If you could only look at 5 data points about an unsigned artist before making a signing decision, what would they be?"

2. Find the Overlap
   The responses will typically share around three common priorities. The remaining two per stakeholder are where the interesting debate lives.

3. Force the 5
   Stakeholders negotiate to a shared set of 5 evaluation criteria. That debate is the process: it surfaces what each leader actually values and builds shared ownership.

4. Operationalize
   Campaign Ops translates the consensus into a repeatable evaluation template that can be populated from existing data sources and A&R input.

5. Test & Iterate
   Run the model against recent signings retrospectively. Did the framework predict which deals worked and which didn't? Refine.
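Step 2's overlap-finding is mechanical once the five-point lists are collected. A sketch; the stakeholder names and criteria strings are hypothetical placeholders:

```python
from collections import Counter

def shared_criteria(responses: dict) -> tuple:
    """Split stakeholder picks into consensus criteria and unique ones.

    responses: stakeholder name -> list of proposed data points.
    Returns (criteria named by every stakeholder,
             per-stakeholder criteria nobody else named).
    """
    counts = Counter(c for picks in responses.values() for c in set(picks))
    n = len(responses)
    common = {c for c, k in counts.items() if k == n}
    unique = {s: [c for c in picks if counts[c] == 1]
              for s, picks in responses.items()}
    return common, unique
```

The `common` set is the uncontroversial core; the `unique` lists are the agenda for the Force-the-5 negotiation.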
Critical: This is collaborative and time-bound. Stakeholders shape the criteria during the build phase. Once locked, the framework runs. If you don't speak up during the build, you don't get to redesign it after launch.

Campaign Ops does not own the data. It does not own the taste. It does not own the market read. What it owns is the integration layer — the function that ensures every signing decision is informed by a unified picture rather than a single signal.

Campaign Ops Does
  • Translate data into decision-ready intelligence
  • Structure the evaluation so all signals are visible
  • Surface patterns across the pipeline
  • Ensure consistency in how artists are assessed
  • Flag when signals contradict each other

Campaign Ops Does Not
  • Generate the underlying data
  • Override A&R's creative judgment
  • Approve or deny signings
  • Replace the expertise of any department
  • Make the final call

This framework is the architecture. The model is built through the process described above, with input from the people who own each piece. Campaign Ops owns the integration — the assurance that when this organization makes a signing decision, it's making it with the fullest possible picture.