Web Scraping for Cryptocurrency & DeFi: How AI Agents Track Prices, On-Chain Data, Sentiment & Market Intelligence in 2026
The global cryptocurrency market surpassed $3.5 trillion in total market cap in 2026, with daily trading volume exceeding $150 billion across 600+ exchanges. DeFi protocols hold over $200 billion in total value locked (TVL). And every day, new tokens launch, liquidity pools shift, whale wallets move billions, and market sentiment can flip in minutes.
Yet professional crypto data is expensive and fragmented. Chainalysis charges $100K-$500K/year for institutional analytics. Nansen commands $3K-$10K/month for on-chain intelligence. Messari Pro costs $2K-$5K/month. Kaiko's market data starts at $5K/month. All while most of this information (prices, TVL, token listings, social sentiment, whale transactions) is published across public blockchains, aggregator sites, and social platforms.
In this guide, you'll build an AI-powered crypto intelligence system that scrapes exchange prices and order books, tracks DeFi protocol metrics, monitors whale wallets, aggregates social sentiment from Crypto Twitter and Reddit, and uses GPT-4o to identify alpha opportunities before the crowd.
Why AI Agents Are Transforming Crypto Intelligence
Crypto markets have characteristics that make them uniquely suited for AI agent automation:
- 24/7/365 markets: Unlike traditional finance, crypto never sleeps. You need monitoring that doesn't sleep either.
- Extreme fragmentation: Prices vary across 600+ exchanges, thousands of DEX pools, and dozens of blockchains. No single source has everything.
- Information speed = alpha: Whale movements, exchange listings, protocol exploits, and regulatory announcements move markets within seconds. The first to know wins.
- Public blockchains: Unlike opaque traditional markets, blockchain data is inherently public and scrapeable. The challenge is aggregating and interpreting it.
- Social-driven markets: Crypto Twitter, Reddit, Telegram groups, and Discord servers drive more price action than fundamentals. Sentiment scraping is essential.
Step 1: Multi-Exchange Price & Order Book Scraping
The foundation of any crypto intelligence system is comprehensive price data across centralized exchanges (CEXs) and decentralized exchanges (DEXs).
Architecture Overview
Your agent needs to scrape prices from multiple exchange frontends, aggregator sites (CoinGecko, CoinMarketCap, DeFiLlama), and DEX interfaces (Uniswap, Raydium, Jupiter), then normalize everything into a unified schema for comparison.
```python
import asyncio
from datetime import datetime
from typing import Optional

from pydantic import BaseModel


class TokenPrice(BaseModel):
    """Unified price record across exchanges."""
    token_symbol: str
    token_name: str
    exchange: str
    exchange_type: str  # "cex" | "dex"
    chain: Optional[str] = None  # For DEX: ethereum, solana, base, etc.
    price_usd: float
    price_btc: Optional[float] = None
    volume_24h: float
    market_cap: Optional[float] = None
    bid: Optional[float] = None
    ask: Optional[float] = None
    spread_pct: Optional[float] = None
    liquidity_usd: Optional[float] = None  # For DEX pools
    last_updated: datetime
    source_url: str


class OrderBookSnapshot(BaseModel):
    """Order book depth snapshot."""
    token_pair: str
    exchange: str
    bids: list  # [(price, quantity), ...]
    asks: list
    bid_depth_1pct: float  # Total bid liquidity within 1% of mid
    ask_depth_1pct: float
    mid_price: float
    spread_bps: float  # Spread in basis points
    timestamp: datetime


class ArbitrageOpportunity(BaseModel):
    """Cross-exchange price discrepancy."""
    token: str
    buy_exchange: str
    buy_price: float
    sell_exchange: str
    sell_price: float
    spread_pct: float
    estimated_profit_usd: float
    buy_liquidity: float
    sell_liquidity: float
    viable: bool  # Accounts for fees, withdrawal times
    detected_at: datetime
```
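The depth and spread fields in `OrderBookSnapshot` can be computed directly from raw bid/ask levels. A minimal sketch, assuming levels arrive as `(price, quantity)` tuples sorted best-first (adjust to whatever shape your scraper actually returns):

```python
def summarize_order_book(bids: list[tuple[float, float]],
                         asks: list[tuple[float, float]]) -> dict:
    """Compute mid price, spread in bps, and 1% depth from (price, qty) levels.

    Assumes bids are sorted descending and asks ascending by price.
    """
    best_bid, best_ask = bids[0][0], asks[0][0]
    mid = (best_bid + best_ask) / 2
    spread_bps = (best_ask - best_bid) / mid * 10_000

    # Liquidity (in quote currency) within 1% of mid on each side
    bid_depth = sum(p * q for p, q in bids if p >= mid * 0.99)
    ask_depth = sum(p * q for p, q in asks if p <= mid * 1.01)

    return {
        "mid_price": mid,
        "spread_bps": spread_bps,
        "bid_depth_1pct": bid_depth,
        "ask_depth_1pct": ask_depth,
    }
```

The four returned keys map straight onto the corresponding `OrderBookSnapshot` fields.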
What to Scrape
Your agent should target these data sources for comprehensive price intelligence:
- CoinGecko / CoinMarketCap: Aggregated prices across 500+ exchanges, market caps, volume rankings, trending tokens
- Exchange frontends: Binance, Coinbase, Kraken, OKX, Bybit, KuCoin for real-time order books and trade history
- DEX aggregators: DeFiLlama, DEX Screener, GeckoTerminal for on-chain prices, pool liquidity, new pair listings
- Uniswap / Raydium / Jupiter: Direct pool data including reserves, fees, and impermanent loss calculations
- Stablecoin monitors: USDT/USDC/DAI peg deviations, redemption queues, backing composition
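The stablecoin peg monitoring above boils down to a deviation threshold; the alert rules later in this guide use 0.5%. A minimal sketch:

```python
def peg_deviation_bps(price: float, peg: float = 1.0) -> float:
    """Deviation from peg in basis points (negative = trading below peg)."""
    return (price - peg) / peg * 10_000


def is_depegged(price: float, threshold_bps: float = 50.0) -> bool:
    """Flag a stablecoin more than `threshold_bps` off peg (50 bps = 0.5%)."""
    return abs(peg_deviation_bps(price)) > threshold_bps
```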
```python
from mantis import MantisClient

client = MantisClient(api_key="your_api_key")


async def scrape_multi_exchange_prices(tokens: list[str]):
    """Scrape prices for tokens across major exchanges."""
    # CoinGecko aggregated data
    coingecko_result = await client.scrape(
        url=f"https://www.coingecko.com/en/coins/{tokens[0]}",
        extract={
            "price_usd": "Current USD price",
            "market_cap": "Total market capitalization",
            "volume_24h": "24-hour trading volume",
            "price_change_24h": "24h price change percentage",
            "exchanges": [{
                "exchange_name": "Exchange name",
                "pair": "Trading pair",
                "price": "Price on this exchange",
                "volume": "Volume on this exchange",
                "spread": "Bid-ask spread percentage"
            }]
        }
    )

    # DEX Screener for on-chain prices
    dex_result = await client.scrape(
        url=f"https://dexscreener.com/search?q={tokens[0]}",
        extract={
            "pools": [{
                "chain": "Blockchain name",
                "dex": "DEX name (Uniswap, Raydium, etc.)",
                "pair": "Token pair",
                "price_usd": "Current price in USD",
                "liquidity": "Pool liquidity in USD",
                "volume_24h": "24h volume",
                "price_change_24h": "24h change",
                "fdv": "Fully diluted valuation",
                "pool_age": "When pool was created"
            }]
        }
    )

    return {"aggregated": coingecko_result, "dex_pools": dex_result}


async def detect_arbitrage(token: str, min_spread: float = 0.5):
    """Find cross-exchange arbitrage opportunities."""
    prices = await scrape_multi_exchange_prices([token])

    all_prices = []
    for exchange_data in prices["aggregated"]["exchanges"]:
        all_prices.append({
            "exchange": exchange_data["exchange_name"],
            "price": float(exchange_data["price"]),
            "volume": float(exchange_data["volume"]),
            "type": "cex"
        })
    for pool in prices["dex_pools"]["pools"]:
        all_prices.append({
            "exchange": f"{pool['dex']} ({pool['chain']})",
            "price": float(pool["price_usd"]),
            "volume": float(pool["volume_24h"]),
            "type": "dex"
        })

    # Compare every pair of venues; keep spreads above the threshold
    opportunities = []
    for i, buy in enumerate(all_prices):
        for sell in all_prices[i + 1:]:
            spread = abs(buy["price"] - sell["price"]) / min(buy["price"], sell["price"]) * 100
            if spread >= min_spread:
                cheaper, pricier = sorted((buy, sell), key=lambda v: v["price"])
                opportunities.append(ArbitrageOpportunity(
                    token=token,
                    buy_exchange=cheaper["exchange"],
                    buy_price=cheaper["price"],
                    sell_exchange=pricier["exchange"],
                    sell_price=pricier["price"],
                    spread_pct=spread,
                    estimated_profit_usd=spread * min(buy["volume"], sell["volume"]) / 100,
                    buy_liquidity=cheaper["volume"],
                    sell_liquidity=pricier["volume"],
                    viable=spread > 1.0,  # Rough threshold to account for fees
                    detected_at=datetime.now()
                ))

    return sorted(opportunities, key=lambda x: x.spread_pct, reverse=True)
```
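The `viable` flag above uses a flat 1% cutoff. A slightly more honest check subtracts explicit costs; the default fee numbers below are illustrative placeholders, not real exchange fee schedules:

```python
def net_arbitrage_pct(buy_price: float, sell_price: float,
                      buy_fee_pct: float = 0.10, sell_fee_pct: float = 0.10,
                      transfer_cost_pct: float = 0.05) -> float:
    """Gross cross-venue spread minus both taker fees and transfer cost, in percent."""
    gross_pct = (sell_price - buy_price) / buy_price * 100
    return gross_pct - buy_fee_pct - sell_fee_pct - transfer_cost_pct
```

An opportunity is only worth flagging when the net figure stays positive with headroom for slippage.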
Step 2: DeFi Protocol Monitoring & TVL Tracking
DeFi protocols are the backbone of crypto's financial infrastructure. Monitoring TVL changes, yield rates, and protocol health gives you early warning signals for market movements.
```python
class DeFiProtocol(BaseModel):
    """DeFi protocol metrics snapshot."""
    protocol_name: str
    chain: str
    category: str  # "lending", "dex", "yield", "bridge", "liquid-staking"
    tvl_usd: float
    tvl_change_24h: float
    tvl_change_7d: float
    fees_24h: float
    revenue_24h: float
    active_users_24h: int
    token_price: Optional[float] = None
    token_mcap_tvl_ratio: Optional[float] = None  # Valuation metric
    top_pools: list  # Highest-yield pools
    audit_status: str  # "audited", "unaudited", "partial"
    timestamp: datetime


class YieldOpportunity(BaseModel):
    """Yield farming opportunity."""
    protocol: str
    chain: str
    pool_name: str
    token_pair: str
    apy: float
    tvl: float
    il_risk: str  # "none", "low", "medium", "high"
    smart_contract_risk: str
    is_incentivized: bool  # Emissions-based vs organic yield
    base_apy: float  # Organic yield only
    reward_apy: float  # Token emission yield
    timestamp: datetime


class WhaleTransaction(BaseModel):
    """Large wallet movement."""
    tx_hash: str
    chain: str
    from_label: Optional[str] = None  # "Binance Hot Wallet", "Unknown Whale", etc.
    to_label: Optional[str] = None
    token: str
    amount: float
    value_usd: float
    tx_type: str  # "transfer", "swap", "bridge", "deposit", "withdrawal"
    exchange_flow: Optional[str] = None  # "inflow" (selling signal) | "outflow" (accumulation)
    timestamp: datetime
```
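The `token_mcap_tvl_ratio` field is a common valuation shorthand: a ratio well below 1 means the market values the token at less than the capital the protocol holds, which is one way to screen for "TVL growing but price lagging" setups. A sketch:

```python
from typing import Optional


def mcap_tvl_ratio(market_cap: Optional[float], tvl: float) -> Optional[float]:
    """Market-cap-to-TVL ratio; None if market cap is unknown or TVL is zero."""
    if market_cap is None or tvl <= 0:
        return None
    return market_cap / tvl


def looks_undervalued(ratio: Optional[float], threshold: float = 1.0) -> bool:
    """Crude screen: a ratio under the threshold suggests price lags locked capital."""
    return ratio is not None and ratio < threshold
```

Treat this as a screen, not a verdict: forked protocols and emissions-heavy TVL routinely distort the ratio.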
DeFi Data Sources
- DeFiLlama: TVL rankings, yield aggregator, protocol revenue, chain comparisons, stablecoin flows
- Dune Analytics: Community dashboards with on-chain metrics and protocol-specific analytics
- Token Terminal: Revenue, earnings, and P/E ratios for protocols (treating them like companies)
- DeBank: Whale wallet tracking, portfolio composition, protocol interactions
- Etherscan / Solscan / BaseScan: Block explorers for transaction-level data and contract interactions
```python
async def monitor_defi_protocols():
    """Track DeFi protocol health and yield opportunities."""
    # DeFiLlama TVL data
    tvl_data = await client.scrape(
        url="https://defillama.com/",
        extract={
            "protocols": [{
                "name": "Protocol name",
                "chain": "Primary chain",
                "category": "Protocol category",
                "tvl": "Total value locked",
                "change_1d": "24h TVL change",
                "change_7d": "7d TVL change",
                "fees_24h": "24h fees generated",
                "revenue_24h": "24h revenue to protocol"
            }]
        }
    )

    # DeFiLlama yields
    yield_data = await client.scrape(
        url="https://defillama.com/yields",
        extract={
            "pools": [{
                "protocol": "Protocol name",
                "chain": "Chain",
                "pool": "Pool name",
                "tvl": "Pool TVL",
                "apy": "Current APY",
                "base_apy": "Base APY (organic)",
                "reward_apy": "Reward APY (emissions)",
                "il_risk": "Impermanent loss risk level"
            }]
        }
    )

    # Detect TVL anomalies (potential exploit or bank run)
    alerts = []
    for protocol in tvl_data["protocols"]:
        change = float(protocol.get("change_1d", "0").replace("%", ""))
        if change < -15:
            alerts.append({
                "type": "TVL_CRASH",
                "protocol": protocol["name"],
                "change": change,
                "severity": "critical" if change < -30 else "warning",
                "action": "Investigate: possible exploit, depeg, or bank run"
            })

    return {"protocols": tvl_data, "yields": yield_data, "alerts": alerts}


async def track_whale_wallets(wallets: list[str]):
    """Monitor whale wallet activity via block explorers."""
    activities = []
    for wallet in wallets:
        # Check Etherscan for recent transactions
        result = await client.scrape(
            url=f"https://etherscan.io/address/{wallet}",
            extract={
                "balance_eth": "ETH balance",
                "balance_usd": "USD value of holdings",
                "recent_txs": [{
                    "hash": "Transaction hash",
                    "method": "Transaction method/type",
                    "to": "Recipient address",
                    "value": "Transaction value",
                    "token": "Token transferred (if token transfer)",
                    "age": "How long ago"
                }],
                "token_holdings": [{
                    "token": "Token name",
                    "balance": "Token balance",
                    "value_usd": "USD value"
                }]
            }
        )
        activities.append({"wallet": wallet, "data": result})
    return activities
```
Step 3: Social Sentiment & Alpha Scraping
In crypto, social sentiment moves markets more than fundamentals. Your agent needs to monitor Crypto Twitter (CT), Reddit, Telegram, and Discord for alpha signals.
```python
class CryptoSentiment(BaseModel):
    """Social sentiment snapshot for a token."""
    token: str
    platform: str  # "twitter", "reddit", "telegram", "discord"
    mentions_1h: int
    mentions_24h: int
    mention_change_pct: float  # vs previous 24h
    sentiment_score: float  # -1.0 to 1.0
    bullish_pct: float
    bearish_pct: float
    neutral_pct: float
    top_influencers_mentioning: list[str]
    trending_narratives: list[str]
    fud_signals: list[str]
    timestamp: datetime


class TokenLaunchAlert(BaseModel):
    """New token or exchange listing alert."""
    token_name: str
    token_symbol: str
    chain: str
    event_type: str  # "dex_launch", "cex_listing", "airdrop", "token_unlock"
    exchange_or_dex: str
    initial_liquidity: Optional[float] = None
    initial_mcap: Optional[float] = None
    contract_address: Optional[str] = None
    is_verified: bool
    honeypot_check: str  # "safe", "suspicious", "honeypot"
    social_buzz_score: float
    detected_at: datetime
```
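The `sentiment_score` field (-1.0 to 1.0) can be derived from raw bullish/bearish/neutral counts, such as CryptoPanic's community votes. A sketch:

```python
def sentiment_score(bullish: int, bearish: int, neutral: int = 0) -> float:
    """Map vote counts to a score in [-1.0, 1.0]; 0.0 when there are no votes."""
    total = bullish + bearish + neutral
    if total == 0:
        return 0.0
    return (bullish - bearish) / total
```

Neutral votes dilute the score toward zero, which is usually what you want for thinly discussed tokens.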
Sentiment Sources
- LunarCrush / Santiment: Social volume, sentiment scoring, influencer tracking, trending tokens
- CryptoPanic: News aggregator with community sentiment voting (bullish/bearish)
- Reddit (r/cryptocurrency, r/defi, r/ethfinance): Discussion trends, FUD detection, narrative shifts
- Twitter/X: Crypto influencer posts, trending $cashtags, engagement spikes
- Telegram / Discord: Alpha groups, project announcements, community sentiment
```python
async def scrape_crypto_sentiment(token: str):
    """Aggregate social sentiment across platforms."""
    # CryptoPanic news sentiment
    news_sentiment = await client.scrape(
        url=f"https://cryptopanic.com/news/{token.lower()}/",
        extract={
            "articles": [{
                "title": "Article headline",
                "source": "News source",
                "votes_bullish": "Number of bullish votes",
                "votes_bearish": "Number of bearish votes",
                "votes_important": "Important votes",
                "time_ago": "When published"
            }]
        }
    )

    # LunarCrush social metrics
    social_metrics = await client.scrape(
        url=f"https://lunarcrush.com/coins/{token.lower()}",
        extract={
            "social_volume": "Total social mentions",
            "social_dominance": "% of total crypto social volume",
            "sentiment": "Overall sentiment score",
            "galaxy_score": "LunarCrush Galaxy Score",
            "alt_rank": "AltRank position",
            "influencer_mentions": [{
                "influencer": "Account name",
                "followers": "Follower count",
                "sentiment": "Bullish/bearish"
            }],
            "trending_topics": "Related trending topics"
        }
    )

    # Reddit discussion volume
    reddit_data = await client.scrape(
        url=f"https://www.reddit.com/r/cryptocurrency/search/?q={token}&sort=new",
        extract={
            "posts": [{
                "title": "Post title",
                "upvotes": "Number of upvotes",
                "comments": "Number of comments",
                "sentiment": "Overall tone (bullish/bearish/neutral)",
                "time_ago": "When posted"
            }]
        }
    )

    return {"news": news_sentiment, "social": social_metrics, "reddit": reddit_data}


def parse_usd(text: str) -> float:
    """Parse '$52.3K' / '$1.2M' style strings into floats."""
    text = str(text).strip().replace("$", "").replace(",", "")
    multiplier = 1.0
    for suffix, mult in (("K", 1e3), ("M", 1e6), ("B", 1e9)):
        if text.upper().endswith(suffix):
            multiplier, text = mult, text[:-1]
            break
    try:
        return float(text) * multiplier
    except ValueError:
        return 0.0


async def detect_new_token_launches():
    """Monitor for new token launches and exchange listings."""
    # DEX Screener new pairs
    new_pairs = await client.scrape(
        url="https://dexscreener.com/new-pairs",
        extract={
            "pairs": [{
                "chain": "Blockchain",
                "dex": "DEX name",
                "token_name": "New token name",
                "token_symbol": "Symbol",
                "pair_age": "How old the pair is",
                "price": "Current price",
                "liquidity": "Pool liquidity",
                "volume": "Trading volume",
                "price_change": "Price change since launch",
                "txns": "Number of transactions",
                "makers": "Number of unique traders"
            }]
        }
    )

    # Filter for promising launches (liquidity > $50K).
    # Note: a naive chain of .replace("K", "000") mangles values like "$1.5M",
    # so suffixes are parsed properly via parse_usd.
    promising = []
    for pair in new_pairs.get("pairs", []):
        if parse_usd(pair.get("liquidity", "0")) > 50_000:
            promising.append(pair)
    return promising
```
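`detect_new_token_launches` will see the same pairs on every poll, so in practice you track which pairs you have already surfaced. A minimal in-memory deduper (persist the set to disk or Redis for anything long-running):

```python
class PairDeduper:
    """Remember pair identifiers already alerted on, so each launch fires once."""

    def __init__(self) -> None:
        self._seen: set[str] = set()

    def is_new(self, pair_id: str) -> bool:
        """Return True the first time a pair id is seen, False afterwards."""
        if pair_id in self._seen:
            return False
        self._seen.add(pair_id)
        return True
```

A stable id such as `f"{chain}:{contract_address}"` works better than the token symbol, which scammers routinely reuse.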
Step 4: Exchange Listing & Token Unlock Tracking
Exchange listings (especially on Binance and Coinbase) and token unlock schedules are among the most predictable price catalysts in crypto.
```python
class ExchangeListing(BaseModel):
    """Exchange listing announcement."""
    token: str
    exchange: str
    listing_type: str  # "spot", "futures", "earn", "launchpool"
    announcement_date: datetime
    listing_date: Optional[datetime] = None
    trading_pairs: list[str]
    deposit_open: bool
    withdrawal_open: bool
    price_at_announcement: Optional[float] = None
    current_price: Optional[float] = None
    price_impact_pct: Optional[float] = None


class TokenUnlock(BaseModel):
    """Token unlock/vesting event."""
    token: str
    unlock_date: datetime
    unlock_amount: float
    unlock_value_usd: float
    pct_of_circulating: float
    pct_of_total_supply: float
    recipient: str  # "team", "investor", "ecosystem", "community"
    cliff_or_linear: str
    historical_dump_pct: Optional[float] = None  # Price drop on previous unlocks
```
```python
async def monitor_exchange_listings():
    """Track new exchange listing announcements."""
    # Binance announcements
    binance_listings = await client.scrape(
        url="https://www.binance.com/en/support/announcement/new-cryptocurrency-listing",
        extract={
            "announcements": [{
                "title": "Announcement title",
                "token": "Token being listed",
                "date": "Announcement date",
                "details": "Listing details (pairs, dates)",
                "listing_type": "Spot/Futures/Launchpool"
            }]
        }
    )

    # Coinbase listings
    coinbase_listings = await client.scrape(
        url="https://www.coinbase.com/blog/landing/listing",
        extract={
            "listings": [{
                "token": "Token name",
                "date": "Listing date",
                "networks": "Supported networks",
                "details": "Listing details"
            }]
        }
    )

    # Token unlock schedule (TokenUnlocks.app)
    unlocks = await client.scrape(
        url="https://token.unlocks.app/",
        extract={
            "upcoming": [{
                "token": "Token name",
                "unlock_date": "When tokens unlock",
                "amount": "Number of tokens",
                "value_usd": "USD value of unlock",
                "pct_supply": "% of circulating supply",
                "type": "Team/Investor/Ecosystem"
            }]
        }
    )

    return {
        "binance": binance_listings,
        "coinbase": coinbase_listings,
        "unlocks": unlocks
    }
```
Step 5: AI-Powered Crypto Analysis Engine
Now combine all data streams into a GPT-4o-powered analysis engine that generates actionable intelligence.
```python
CRYPTO_ANALYSIS_PROMPT = """You are a crypto market analyst AI. Analyze the following data and provide:

1. MARKET OVERVIEW: Current market regime (risk-on/risk-off/neutral), BTC dominance trend, and key macro factors

2. ALPHA SIGNALS: Top 3-5 opportunities based on:
   - Cross-exchange arbitrage (>1% spread with sufficient liquidity)
   - TVL inflows to undervalued protocols (TVL growing but token price lagging)
   - Social sentiment divergence (positive sentiment + declining price = potential reversal)
   - Token unlocks creating selling pressure (>5% of circulating supply)
   - New exchange listings (historically +30-80% within 48h of Binance listing)

3. RISK ALERTS:
   - Protocols with >15% TVL decline (potential exploit or bank run)
   - Stablecoin depegs (>0.5% deviation)
   - Whale exchange inflows (selling signal)
   - FUD narratives gaining traction

4. PORTFOLIO ACTIONS: Specific buy/sell/hold recommendations with confidence levels

Data:
- Exchange prices: {prices}
- DeFi protocols: {defi}
- Social sentiment: {sentiment}
- Whale activity: {whales}
- Exchange listings: {listings}
- Token unlocks: {unlocks}

Be specific. Include token names, prices, and actionable thresholds."""
```
```python
async def run_crypto_analysis():
    """Full crypto intelligence pipeline."""
    # Gather all data
    prices = await scrape_multi_exchange_prices(["bitcoin", "ethereum", "solana"])
    defi = await monitor_defi_protocols()
    sentiment = await scrape_crypto_sentiment("BTC")
    whales = await track_whale_wallets([
        "0x...",  # Known whale addresses
    ])
    listings = await monitor_exchange_listings()

    # Send to GPT-4o for analysis
    analysis = await client.analyze(
        prompt=CRYPTO_ANALYSIS_PROMPT.format(
            prices=prices,
            defi=defi,
            sentiment=sentiment,
            whales=whales,
            listings=listings,
            unlocks=listings.get("unlocks", {})
        )
    )

    # Alert on critical findings (send_slack_alert is your notification helper)
    if any(alert["severity"] == "critical" for alert in defi.get("alerts", [])):
        await send_slack_alert(
            channel="#crypto-alerts",
            message=f"CRITICAL: {analysis['risk_alerts']}"
        )

    return analysis
```
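`send_slack_alert` is referenced above but never defined; one common implementation posts to a Slack incoming webhook. A sketch using only the standard library (the webhook URL is a placeholder you would create in Slack's app settings):

```python
import asyncio
import json
import urllib.request

SLACK_WEBHOOK_URL = "https://hooks.slack.com/services/T000/B000/XXXX"  # placeholder


def build_slack_payload(channel: str, message: str) -> bytes:
    """Serialize the webhook body; incoming webhooks accept a simple text payload."""
    return json.dumps({"channel": channel, "text": message}).encode()


async def send_slack_alert(channel: str, message: str) -> None:
    """POST the alert to Slack, pushing the blocking HTTP call off the event loop."""
    req = urllib.request.Request(
        SLACK_WEBHOOK_URL,
        data=build_slack_payload(channel, message),
        headers={"Content-Type": "application/json"},
    )
    await asyncio.to_thread(urllib.request.urlopen, req)
```

For higher alert volume, an async HTTP client (e.g. aiohttp) avoids spawning a thread per message.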
Build Your Crypto Intelligence Agent
Mantis handles the scraping infrastructure (anti-bot bypass, JavaScript rendering, rotating proxies) so you can focus on alpha generation.
Enterprise Crypto Data: Cost Comparison
| Provider | Monthly Cost | Coverage | Best For |
|---|---|---|---|
| Chainalysis | $8K-$40K/mo | On-chain analytics, compliance, risk scoring | Compliance teams, institutions |
| Nansen | $3K-$10K/mo | Whale tracking, smart money, token flows | Traders, funds |
| Messari | $2K-$5K/mo | Fundamental analysis, research, screeners | Research analysts |
| Kaiko | $5K-$20K/mo | Market data, order books, trade history | Quant funds, market makers |
| Santiment | $1K-$5K/mo | Social analytics, on-chain, development | Sentiment-driven traders |
| Glassnode | $800-$3K/mo | On-chain metrics, HODL waves, supply dynamics | Bitcoin-focused analysts |
| AI Agent + Mantis | $29-$299/mo | All of the above (public data), customizable | Anyone building crypto tools |
Step 6: Automated Alerting & Dashboard
Set up real-time alerts for the signals that matter most:
```python
async def setup_crypto_alerts():
    """Configure automated alert rules."""
    alert_rules = {
        "arbitrage": {
            "condition": "cross_exchange_spread > 1.5%",
            "channels": ["slack", "telegram"],
            "cooldown_minutes": 15,
            "message": "Arbitrage: Buy {token} on {buy_exchange} at ${buy_price}, sell on {sell_exchange} at ${sell_price} ({spread}% spread)"
        },
        "whale_movement": {
            "condition": "transfer_value > $10M AND direction == 'exchange_inflow'",
            "channels": ["slack"],
            "cooldown_minutes": 5,
            "message": "Whale Alert: {amount} {token} (${value}) moved to {exchange}, potential sell pressure"
        },
        "tvl_crash": {
            "condition": "tvl_change_24h < -15%",
            "channels": ["slack", "telegram"],
            "cooldown_minutes": 60,
            "message": "TVL Crash: {protocol} down {change}% in 24h, TVL now ${tvl}. Investigate for exploit."
        },
        "new_listing": {
            "condition": "exchange IN ['binance', 'coinbase'] AND listing_type == 'spot'",
            "channels": ["slack", "telegram"],
            "cooldown_minutes": 0,
            "message": "New Listing: {token} listed on {exchange}, historically +30-80% within 48h"
        },
        "sentiment_spike": {
            "condition": "social_volume_change > 500% AND sentiment > 0.7",
            "channels": ["slack"],
            "cooldown_minutes": 120,
            "message": "Sentiment Spike: {token} social volume up {change}% with {sentiment_score} bullish sentiment"
        },
        "token_unlock": {
            "condition": "unlock_pct_circulating > 5% AND days_until_unlock < 7",
            "channels": ["slack"],
            "cooldown_minutes": 1440,  # Daily
            "message": "Token Unlock: {token} unlocking {amount} tokens ({pct}% of supply) on {date}, ${value} USD"
        },
        "stablecoin_depeg": {
            "condition": "abs(price - 1.0) > 0.005",
            "channels": ["slack", "telegram"],
            "cooldown_minutes": 30,
            "message": "Stablecoin Depeg: {stablecoin} trading at ${price}, {deviation}% off peg"
        }
    }
    return alert_rules
```
Use Cases by Organization Type
1. Crypto Hedge Funds & Trading Desks
Professional traders need multi-exchange price surveillance, order book depth analysis, and whale tracking to execute informed trades. AI agents provide institutional-grade monitoring at retail prices: cross-exchange arbitrage detection, liquidation cascade warnings, and funding rate divergence alerts.
2. DeFi Protocol Teams
Protocol teams must monitor competitor TVL, yield rates, and governance proposals. Scrape competitor protocol metrics daily to benchmark performance, track liquidity migration patterns, and detect emerging competitive threats before they impact market share.
3. Crypto VCs & Research Analysts
Investment teams need comprehensive due diligence data: on-chain usage metrics, social traction, developer activity, tokenomics analysis. AI agents compile investment memos automatically by aggregating data across dozens of sources that would take analysts hours to check manually.
4. Compliance & Risk Teams
Exchanges and fintech companies need to monitor for suspicious activity patterns, sanctions exposure, and regulatory developments. While not a replacement for Chainalysis, AI agents can provide supplementary monitoring of public blockchain data, exchange announcements, and regulatory news feeds.
Advanced: Narrative Momentum Engine
Crypto markets are driven by narratives: AI, RWA, DePIN, restaking, memecoins. Building a narrative detection engine gives you an edge in identifying trending sectors before they peak.
```python
class NarrativeMomentum(BaseModel):
    """Crypto narrative trend analysis."""
    narrative: str  # "AI tokens", "RWA", "DePIN", "L2s", "Memecoins"
    lifecycle_stage: str  # "emerging", "growing", "peaking", "declining"
    tokens_in_narrative: list[str]
    combined_mcap: float
    mcap_change_7d: float
    social_volume_trend: str  # "accelerating", "stable", "decelerating"
    institutional_interest: str  # Based on fund flow data
    key_catalysts: list[str]  # Upcoming events driving narrative
    risk_factors: list[str]
    recommendation: str  # "accumulate", "hold", "reduce", "avoid"
    confidence: float


async def analyze_narrative_momentum():
    """Identify and rank crypto narratives by momentum."""
    narratives = {
        "AI Tokens": ["FET", "RENDER", "TAO", "NEAR", "AKT"],
        "Real World Assets": ["ONDO", "MKR", "CPOOL", "MPL"],
        "DePIN": ["FIL", "AR", "HNT", "THETA"],
        "Layer 2s": ["ARB", "OP", "STRK", "MANTA", "BLAST"],
        "Liquid Staking": ["LDO", "RPL", "SWISE", "FXS", "SD"],
        "Restaking": ["EIGEN", "ALT", "ETHFI", "PUFFER"],
        "Memecoins": ["DOGE", "SHIB", "PEPE", "WIF", "BONK"]
    }

    results = []
    for narrative, tokens in narratives.items():
        # Scrape combined metrics for every token in the narrative
        combined_data = []
        for token in tokens:
            data = await scrape_multi_exchange_prices([token])
            sentiment = await scrape_crypto_sentiment(token)
            combined_data.append({
                "token": token,
                "price_data": data,
                "sentiment": sentiment
            })
        results.append({"narrative": narrative, "tokens": combined_data})

    # GPT-4o ranks narratives by momentum
    ranking = await client.analyze(
        prompt=f"""Rank these crypto narratives by current momentum.
For each: lifecycle stage, recommendation, confidence score.
Data: {results}"""
    )
    return ranking
```
Getting Started
Building a crypto intelligence agent with Mantis follows this progression:
- Week 1: Set up multi-exchange price scraping (CoinGecko + DEX Screener). Build arbitrage detection.
- Week 2: Add DeFi protocol monitoring (DeFiLlama TVL + yields). Set up TVL crash alerts.
- Week 3: Integrate social sentiment (CryptoPanic + LunarCrush). Build narrative momentum engine.
- Week 4: Add whale tracking, exchange listing alerts, token unlock schedule. Deploy GPT-4o analysis pipeline.
Start Scraping Crypto Data Today
100 free API calls/month. No credit card required. Build your crypto intelligence agent in an afternoon.
Get Your API Key