Right now, when A&R evaluates an unsigned artist, the process relies heavily on a single signal: a chart going up and to the right. The room reacts to the slope of the line, not to what the line actually represents.
The result: the label is using surface-level data to make high-stakes investment decisions. Volume, consistency, engagement depth, scene context, and management quality all matter — but there's no system that brings them together into a unified picture.
No single data source gives you the full picture. Good intelligence comes from corroborating multiple signals to paint a story — then using that story to make a decision.
This framework evaluates unsigned artists through three lenses. Each captures a different dimension. Together, they produce a picture that's more complete than any single department can generate alone.
Lens 1: Scale
- Monthly listeners
- Total catalog streams
- Social follower base
- Live ticket capacity
Lens 2: Trajectory
- Growth rate across windows
- Consistency across releases
- Listener retention rate
- Organic vs. artificial growth
Lens 3: Intelligence
- Network / collaborators
- Management situation
- Live performance ability
- Scene / cultural positioning
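The three lenses can be captured as a structured profile so every artist enters the pipeline with the same shape of data. The sketch below is illustrative only: the field names and types are assumptions, not an agreed schema, and would need to be validated with Strategic Analytics.

```python
from dataclasses import dataclass

# Hypothetical schema for the three-lens artist profile.
# All field names are illustrative placeholders.

@dataclass
class ScaleLens:
    monthly_listeners: int
    catalog_streams: int
    social_followers: int
    ticket_capacity: int        # typical live room size sold

@dataclass
class TrajectoryLens:
    growth_rates: dict          # window label -> growth rate, e.g. {"28d": 0.12}
    release_consistency: float  # 0-1, how evenly releases perform
    listener_retention: float   # 0-1, listeners retained between releases
    organic_share: float        # 0-1, share of growth judged organic

@dataclass
class IntelligenceLens:
    collaborators: list         # producers, writers, featured artists
    management: str             # who manages the artist, if anyone
    live_proven: bool           # has sold tickets / can perform
    scene: str                  # cultural positioning

@dataclass
class ArtistProfile:
    name: str
    scale: ScaleLens
    trajectory: TrajectoryLens
    intelligence: IntelligenceLens
```

The value of a fixed shape is that a missing field is visible as a gap rather than silently absent from the conversation.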
Every evaluation starts with the same question: What is the business decision you need to make, and what confidence level does that decision require?
For unsigned artists, there are three tiers of decision. Each requires progressively more intelligence:
| Decision | Data Required | Confidence | Who Inputs |
|---|---|---|---|
| 1. Pursue or pass? | Scale + basic trajectory (Lens 1–2 surface data) | Directional — quick read | Strategic Analytics |
| 2. How aggressively? | Full trajectory + engagement depth + scene context | Corroborated — multiple signals aligned | Analytics + A&R + Commercial |
| 3. What's the offer? | All three lenses + management intel + deal comps | High confidence — unified picture | Analytics + A&R + Business & Legal |
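The routing logic in the table reduces to a simple gate: the depth of the decision you can responsibly make is bounded by the lenses you have assembled. A minimal sketch, assuming the three-tier structure above (function name and return labels are illustrative):

```python
def decision_tier(have_scale: bool, have_trajectory: bool, have_intel: bool) -> str:
    """Return the deepest decision tier the assembled evidence supports.

    Tiers are cumulative: an offer (tier 3) requires everything a
    pursue/pass read (tier 1) requires, plus trajectory and intelligence.
    """
    if have_scale and have_trajectory and have_intel:
        return "3: What's the offer?"
    if have_scale and have_trajectory:
        return "2: How aggressively?"
    if have_scale:
        return "1: Pursue or pass?"
    return "insufficient data"
```

The point of encoding it this way is the failure mode it prevents: an offer conversation starting before the intelligence lens exists.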
Before evaluating trajectory, you have to anchor scale. How big is this artist right now? Growth percentages are meaningless without context — 50% growth at 500K monthly listeners is a fundamentally different signal than 50% growth at 5K.
Preliminary Volume Tiers
To be validated with Strategic Analytics
| Monthly Listeners | Tier | What It Tells You |
|---|---|---|
| Under 50K | Embryonic | Very early. Audience exists but hasn't been tested across releases. High variance. |
| 50K – 250K | Developing | Has an audience. Enough data to start evaluating consistency and retention patterns. |
| 250K – 500K | Emerging | Something real is happening. Multiple signals should be corroborating by now. |
| 500K – 1M | Established Unsigned | Proven they can build. The question shifts from "is it real?" to "why haven't they been signed?" |
| 1M+ | Rare Unsigned | Significant pre-existing audience. Either a massive opportunity or a red flag worth investigating. |
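The tier boundaries and the scale-context point can be made concrete in a few lines. This is a sketch of the preliminary table above; the boundaries mirror it exactly and, like the table, are still to be validated with Strategic Analytics.

```python
# Preliminary tier boundaries (upper bound exclusive), per the table above.
TIERS = [
    (50_000, "Embryonic"),
    (250_000, "Developing"),
    (500_000, "Emerging"),
    (1_000_000, "Established Unsigned"),
]

def volume_tier(monthly_listeners: int) -> str:
    """Map a monthly-listener count to its preliminary tier label."""
    for upper_bound, label in TIERS:
        if monthly_listeners < upper_bound:
            return label
    return "Rare Unsigned"

def absolute_gain(base: int, growth_rate: float) -> int:
    """Why growth percentages need anchoring: the same rate at different
    scales represents very different listener volume."""
    return round(base * growth_rate)

# 50% growth at 500K adds 250,000 listeners; at 5K it adds 2,500.
```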
This is regression to the mean: given a large enough sample, performance reverts to the average. When an unsigned artist shows a sudden spike, the statistically correct default assumption is that they will come back down. The spike is noise until proven otherwise.
What Proves a Spike Is Real
- Sustained growth across multiple releases
- Engagement metrics that hold post-spike
- Geographic spread that widens, not concentrates
- Listener retention between releases stays above baseline
What Suggests a Spike Is Noise
- Single-song virality without catalog lift
- Playlist-driven discovery without organic follow-through
- Social growth that doesn't convert to streams
- Short track record with no prior consistency data
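One way to operationalize the default-to-noise stance is to count corroborating signals against noise signals and only upgrade the read when the evidence clearly outweighs the skepticism. The signal names and thresholds below are illustrative assumptions, not calibrated values:

```python
# Signals from the two lists above, as illustrative flags.
REAL_SIGNALS = {
    "multi_release_growth",
    "engagement_holds_post_spike",
    "geo_spread_widening",
    "retention_above_baseline",
}

NOISE_SIGNALS = {
    "single_song_virality",
    "playlist_only_discovery",
    "social_without_streams",
    "short_track_record",
}

def spike_read(observed: set) -> str:
    """Default to noise (mean reversion) unless corroboration outweighs it."""
    real = len(observed & REAL_SIGNALS)
    noise = len(observed & NOISE_SIGNALS)
    if real >= 3 and noise == 0:
        return "likely real"
    if real > noise:
        return "worth corroborating further"
    return "treat as noise"
```

Note that an empty signal set returns "treat as noise": absence of evidence keeps the mean-reversion default, which is the stance the framework takes.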
The evaluation framework should explicitly flag which category an artist's recent performance falls into — not to discourage pursuit, but to inform the confidence level and structure of the offer.
Data tells you if it's working. Human intelligence tells you why — and whether it will keep working. The qualitative layer isn't a replacement for analytics. It's the context that turns numbers into a decision.
Network & Collaborators
Who has the artist worked with? What producers, writers, or featured artists are in their orbit? Relationships predict trajectory. An artist who writes for established acts or co-produces with in-demand collaborators brings creative credibility that has already been market-tested, a meaningful de-risker.
Management Situation
Is the artist professionally managed? By whom? Management track record is one of the strongest predictors of post-signing execution.
Live Performance
Can they perform? Have they sold tickets? Live viability is a massive de-risker and increasingly central to long-term revenue.
Cultural Positioning
Are they part of an ascending scene? Are they the tip of a spear or a late arrival? Scene momentum is a rising tide — but it recedes.
Work Ethic & Partnership Potential
Will they be collaborative or difficult? Track record of reliability matters for deal structure and team commitment.
This framework defines the architecture. The model itself is built through structured input from the stakeholders who own each lens. Campaign Ops does not own the data, the taste, or the market read. What it owns is the integration layer: the function that ensures every signing decision is informed by a unified picture rather than a single signal.