Engineering · April 30, 2026 · 7 min read

Hunting Now: how a search bug became a public demand queue

Chef tested with a brand-new account, searched "ebk jaaybo", and got zero results. The next two scrapes ignored it. Tracing why turned into a P0 fix, then four feature rounds, then a public page that recruits visitors into searching the artists they want — turning the gap between "what users want" and "what we have" into the most active surface on the site.

The bug

We have a missing_artist_requests table. The iOS Find My Sound flow is supposed to log a row whenever a user searches an artist we don't have, with a priority that bumps on every repeat search. The scraper's queue function pulls top-priority rows first, runs three query variants per artist on YouTube, and lands beats in the catalog. End-to-end loop: search → demand row → next scrape → catalog grows.
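The loop hinges on two behaviors: repeat searches bump priority, and the queue drains highest-priority first. A minimal in-memory sketch of that contract — the real log_missing_artist_request is a Postgres RPC, so the row shape and function names here are assumptions for illustration:

```typescript
// Hypothetical in-memory model of missing_artist_requests;
// the real table lives in Postgres behind a Supabase RPC.
type DemandRow = {
  artistName: string;
  priority: number;
  status: "pending" | "queued" | "scraped";
};

const demand = new Map<string, DemandRow>();

// First search inserts at priority 1; every repeat search bumps priority.
function logMissingArtistRequest(artistName: string): DemandRow {
  const key = artistName.toLowerCase();
  const existing = demand.get(key);
  if (existing) {
    existing.priority += 1; // repeat search → higher priority
    return existing;
  }
  const row: DemandRow = { artistName, priority: 1, status: "pending" };
  demand.set(key, row);
  return row;
}

// The scraper's queue step pulls highest-priority pending rows first.
function queueMissingRequests(limit: number): DemandRow[] {
  return [...demand.values()]
    .filter((r) => r.status === "pending")
    .sort((a, b) => b.priority - a.priority)
    .slice(0, limit);
}
```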

Chef's test broke the loop. EBK Jaaybo is a Stockton drill artist not in our master list. He searched. Nothing in the catalog. The next scrape passed the row over. Then the one after. Three different bugs were stacked on top of each other.

Bug 1: the search rescue path

The search code's demand-logging gate read:

const lowResults = beats.length < 10;
let demandArtistName = detectedArtist;
if (lowResults && !demandArtistName) {
  demandArtistName = extractArtistFromQuery(query);
}
const buildingYourSound = lowResults && !!demandArtistName;
if (buildingYourSound && demandArtistName) {
  supabase.rpc('log_missing_artist_request', { ... });
}

Looks fine. But there's a multi-word fallback before this block: when an artist isn't found in the master list, the code runs search_artist_beats_v2 against each significant word in the query. So "ebk jaaybo type beats" became three separate searches: "ebk", "jaaybo", "beats". The "ebk" search returned 12+ beats with "ebk" anywhere in the description — random producers, unrelated tracks. Result: lowResults = false. The log never fired. The user saw irrelevant beats and assumed the platform didn't have what they wanted.
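The failure mode is easy to reproduce in miniature. A sketch of how the per-word fallback inflates the result count — the word filter and the substring matcher here are stand-ins for the real extractor and search_artist_beats_v2, not the shipped code:

```typescript
// Stand-in word splitter: drops filler words but, like the real one,
// keeps "beats" as a significant word.
const FILLER = new Set(["type"]);

function extractSignificantWords(query: string): string[] {
  return query.toLowerCase().split(/\s+/).filter((w) => !FILLER.has(w));
}

// Stand-in for search_artist_beats_v2: naive substring match on descriptions.
function searchBeatsByWord(word: string, descriptions: string[]): string[] {
  return descriptions.filter((d) => d.toLowerCase().includes(word));
}

// A catalog with zero EBK Jaaybo beats, but 12 descriptions containing "ebk".
const catalog = Array.from({ length: 12 }, (_, i) => `prod. somebody ebk remix ${i}`);

const words = extractSignificantWords("ebk jaaybo type beats"); // ebk, jaaybo, beats
const hits = words.flatMap((w) => searchBeatsByWord(w, catalog));

// 12 irrelevant "ebk" substring hits → lowResults = false → demand never logged.
const lowResults = hits.length < 10;
```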

Fix: log demand whenever the user typed an explicit unknown artist (extracted but not in the master list), regardless of result count. The substring rescue can fool the result counter, but it can't fool the "did the user type an artist name we don't have" check.
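The fixed gate, sketched under the same assumptions (masterList and extractArtistFromQuery are illustrative stand-ins for the real lookups):

```typescript
// Illustrative master list; the real one is far larger.
const masterList = new Set(["drake", "future", "gunna"]);

// Naive stand-in extractor: everything before "type beat(s)" is the artist.
function extractArtistFromQuery(query: string): string | null {
  const m = query.toLowerCase().match(/^(.+?)\s+type\s+beats?$/);
  return m ? m[1].trim() : null;
}

// Returns the artist name to log, or null if no demand should be recorded.
function shouldLogDemand(query: string, resultCount: number): string | null {
  const artist = extractArtistFromQuery(query);
  // New gate: an explicit artist not in the master list always logs,
  // no matter how many accidental substring matches came back.
  if (artist && !masterList.has(artist)) return artist;
  // Old gate, kept for known artists with a genuinely thin catalog.
  if (resultCount < 10 && artist) return artist;
  return null;
}
```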

Bug 2: rows stuck in 'queued'

Even when the row landed, the scraper's queue function had a classic dangling-state bug. queue_missing_requests flipped rows to status='queued' before running their search queries — defensive, in case a crash mid-loop caused duplicate work. But there was no mark_scraped() step at the end of the phase. (Actually: the function existed; it just wasn't called anywhere.) So every row that got queued stayed 'queued' forever: processed once, never marked done, and invisible to every future run.

Fix: at end of phase 1, if quota wasn't dead, mark all queued rows as 'scraped'. They've done their job.
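The fix in sketch form, using an in-memory rows array in place of the real table update (the function name and quota flag are assumptions):

```typescript
type Status = "pending" | "queued" | "scraped";
interface Row { artist: string; status: Status; }

// Sketch of the phase-1 tail: if quota survived, everything still 'queued'
// has been processed and gets marked 'scraped'.
function finishPhaseOne(rows: Row[], quotaExhausted: boolean): void {
  if (quotaExhausted) return; // leave rows 'queued'; the recycler handles them
  for (const row of rows) {
    if (row.status === "queued") row.status = "scraped"; // job done
  }
}
```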

Bug 3: stranded rows

Bug 2's fix only handles the happy path. If the YouTube quota runs out partway through phase 1, or the scraper crashes, or the run gets killed mid-job, the rows that were flipped to 'queued' never get marked 'scraped' AND never get retried. Stuck forever until someone runs SQL by hand.

Fix: at the start of queue_missing_requests, recycle any row in 'queued' with updated_at older than 6 hours back to 'pending'. The 6-hour window is comfortably longer than any single scrape run (under 30 minutes) so it only catches genuine orphans. After 6 hours, a stuck row reappears in the queue automatically.
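The recycle step, sketched against the same in-memory model (updatedAt as a millisecond timestamp is an assumption; the real check is a SQL predicate on the table):

```typescript
const SIX_HOURS_MS = 6 * 60 * 60 * 1000;

interface QueueRow { artist: string; status: string; updatedAt: number; }

// Runs at the top of queue_missing_requests: any row stuck in 'queued'
// longer than the recycle window was orphaned by a crash or dead quota,
// so flip it back to 'pending' and let the normal queue pick it up again.
function recycleStaleQueued(rows: QueueRow[], now: number): number {
  let recycled = 0;
  for (const row of rows) {
    if (row.status === "queued" && now - row.updatedAt > SIX_HOURS_MS) {
      row.status = "pending";
      row.updatedAt = now;
      recycled++;
    }
  }
  return recycled;
}
```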

Three bugs stacked. Each one alone would have been a 1% reliability hit. Together they made the demand loop look broken to anyone whose first search wasn't in the master list.

From bug to feature

The P0 fix shipped. EBK Jaaybo (and everyone else who hit the same gap) would now land in the queue, get scraped, get a row in the catalog. Loop closed. But the bug surfaced something bigger: the demand queue was a hidden treasure. Users were searching specific artists and that data was sitting in a Supabase table only Chef could see.

Four feature rounds came out of that observation.

The recruitment loop

3 stacked bugs · 5 feature rounds · 12h scrape cadence · 6h recycle window

The interesting bit isn't the bug fix. It's that exposing the demand queue publicly creates a flywheel:

  1. User searches an artist we don't have.
  2. Search lands in queue at priority 1, scraper picks it up within 12 hours.
  3. That same row appears on /hunting-now immediately (it's a public RPC, no caching).
  4. Someone shares the URL. The OG card shows "EBK Jaaybo · 1 search · IN QUEUE."
  5. A viewer thinks "I want him too" and searches him in the app. Priority +1.
  6. Higher priority means the scraper runs his queries with more pages. More beats land. Catalog grows in the direction of demand.
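Step 6 hinges on priority buying search depth. One possible mapping from demand priority to YouTube result pages, sketched below — the specific numbers are an assumption for illustration, not the scraper's real configuration:

```typescript
// Hypothetical priority → page-depth curve: every two repeat searches
// buy one more results page, capped so a viral artist can't eat the quota.
function pagesForPriority(priority: number): number {
  const BASE_PAGES = 1;
  const MAX_PAGES = 5;
  return Math.min(MAX_PAGES, BASE_PAGES + Math.floor(priority / 2));
}
```

The cap matters: quota is the scarce resource, so demand should widen a search, not monopolize the run.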

Without the OG card, /hunting-now is just a status page. With the OG card, every share is a recruitment ad for the demand queue itself. The loop closes when the artist lands in the catalog and the row gets marked scraped.

What this isn't

Hunting Now isn't a leaderboard. It's a queue. The top of the list isn't "best artist" — it's "highest priority for the next scrape." Once an artist gets enough beats in the catalog, they drop off (status flips to scraped). The page always reflects what's missing, not what's most popular. That's deliberate: trending lives at /trending-now, demand lives at /hunting-now.

It's also not a social network. There's no "follow this search" or "notify me when found" button yet. Those exist partway in the schema (the missing_artist_requests table has a nullable user_id column from when the RPC was first written), but no notification path on top of them. That's the next round.

See what artists are in the queue right now

Live demand. Updated continuously. Search any artist to add your vote.

Open Hunting Now →

What's next

Push notifications. When a queued artist actually lands in the catalog (rows go from queued → scraped and the scraper's save_beat succeeds), notify every user whose user_id was attached to that demand row. Closes the loop fully: search → queue → public surface → catalog → push notification → playback. Probably the next ~50 lines on the scraper plus a new push handler.
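A hedged sketch of what that hook could look like — notifyOnLanding, sendPush, and the row shape are all assumptions about an unshipped feature, not the planned API:

```typescript
interface LandedRow { artist: string; userId: string | null; status: string; }

// Fires only when the row actually reached 'scraped', at least one beat
// saved, and a user is attached to the demand row. Returns whether a
// push was sent, so the scraper can count deliveries.
function notifyOnLanding(
  row: LandedRow,
  beatsSaved: number,
  sendPush: (userId: string, message: string) => void
): boolean {
  if (row.status !== "scraped" || beatsSaved === 0 || !row.userId) return false;
  sendPush(row.userId, `${row.artist} just landed in the catalog`);
  return true;
}
```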

© 2026 StudioMode · Bug to feature in five rounds.