Category: Knowledge Generation · Execution Model: Probabilistic (LLM), Validated, Chunked
The Deep Research Agent transforms raw demand signals into launch-ready, economically validated experiential concepts. It performs large-context semantic reasoning over recent market signals, applies strict business and margin constraints, and emits structured, schema-validated JSON chunks that represent concrete, executable session candidates. This agent is the system’s primary bridge between abstract demand detection and concrete experience creation.
Response schema (object):
{
"message": "string",
"chunksForwarded": "number",
"responses": "Array<{ chunkIndex: number, status: number }>"
}
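The response shape above can be expressed as a TypeScript interface, together with a runtime guard for narrowing an unknown payload. Field names and types come from the schema; the interface names and the guard are illustrative, not part of the documented API:

```typescript
// Field names and types taken from the response schema above;
// the names ChunkResult/ScanResponse are illustrative.
interface ChunkResult {
  chunkIndex: number;
  status: number; // HTTP status returned for that chunk
}

interface ScanResponse {
  message: string;
  chunksForwarded: number;
  responses: ChunkResult[];
}

// Runtime check that an unknown payload matches the schema.
function isScanResponse(v: unknown): v is ScanResponse {
  const o = v as ScanResponse;
  return (
    typeof o === "object" && o !== null &&
    typeof o.message === "string" &&
    typeof o.chunksForwarded === "number" &&
    Array.isArray(o.responses) &&
    o.responses.every(
      (r) => typeof r.chunkIndex === "number" && typeof r.status === "number"
    )
  );
}
```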
Side effects:
- Reads from the IngestedData collection
- Invokes the external LLM provider
- Writes sanitized research output to the Whook ingestion pipeline
Error handling: If no ingested data is available, the agent aborts with a 400 error. If the LLM call fails, execution terminates with a 500 error. Invalid or malformed JSON chunks are skipped but do not abort the run. Only sanitized, schema-valid chunks are forwarded downstream.
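The skip-but-don't-abort behavior for malformed chunks can be sketched as follows. The function and type names are hypothetical stand-ins for the agent's sanitizer; only the skipping semantics come from the description above:

```typescript
// Hypothetical sketch: parse each raw LLM chunk, silently skip
// malformed JSON, and keep only schema-valid chunks for forwarding.
type Chunk = { items: unknown[] };

function isValidChunk(v: unknown): v is Chunk {
  return typeof v === "object" && v !== null && Array.isArray((v as Chunk).items);
}

function collectValidChunks(rawChunks: string[]): Chunk[] {
  const valid: Chunk[] = [];
  for (const raw of rawChunks) {
    let parsed: unknown;
    try {
      parsed = JSON.parse(raw); // malformed JSON is skipped, not fatal
    } catch {
      continue;
    }
    if (isValidChunk(parsed)) valid.push(parsed);
  }
  return valid;
}
```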
Dependencies and constraints:
{
"dependencies": [
"IngestedData",
"LLM Adapter",
"Sanitizer Service"
],
"constraints": [
"Strict JSON-only output",
"Maximum 5 items per chunk",
"Margin validation enforced before ingestion",
"Lowest viable credit tier always selected"
]
}
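Two of the constraints above, the five-item chunk cap and the margin gate, can be sketched like this. The 30% margin floor is an assumed example value, not a documented threshold:

```typescript
// Illustrative enforcement of two constraints: chunks carry at most
// five items, and items failing a margin floor are dropped before
// ingestion. The 0.3 floor is an assumption for the example.
interface Candidate {
  name: string;
  margin: number; // fractional margin, e.g. 0.42
}

const MAX_ITEMS_PER_CHUNK = 5;   // documented cap
const MIN_MARGIN = 0.3;          // assumed floor for illustration

function toChunks(candidates: Candidate[]): Candidate[][] {
  // Margin validation is enforced before chunking/ingestion.
  const viable = candidates.filter((c) => c.margin >= MIN_MARGIN);
  const chunks: Candidate[][] = [];
  for (let i = 0; i < viable.length; i += MAX_ITEMS_PER_CHUNK) {
    chunks.push(viable.slice(i, i + MAX_ITEMS_PER_CHUNK));
  }
  return chunks;
}
```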
Example request:
curl -X POST https://api.whook.ai/v1/agent/research/scan \
-H "Content-Type: application/json" \
-d '{
"resourceSignals": {
"cashAvailableForPilot": 6000,
"targetUtilizationPercent": 0.7,
"coachHoursAvailable": 60,
"siteManagerHoursAvailable": 40,
"maxConcurrentPilots": 6
}
}'
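The same request can be issued from TypeScript using the global fetch available in Node 18+. The endpoint and payload mirror the curl example above; the function names are illustrative:

```typescript
// fetch-based equivalent of the curl example above. The endpoint and
// body fields are taken from the example; runScan/buildScanBody are
// illustrative names.
interface ResourceSignals {
  cashAvailableForPilot: number;
  targetUtilizationPercent: number;
  coachHoursAvailable: number;
  siteManagerHoursAvailable: number;
  maxConcurrentPilots: number;
}

function buildScanBody(signals: ResourceSignals): string {
  return JSON.stringify({ resourceSignals: signals });
}

async function runScan(signals: ResourceSignals): Promise<unknown> {
  const res = await fetch("https://api.whook.ai/v1/agent/research/scan", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: buildScanBody(signals),
  });
  return res.json();
}
```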