How to Detect What Technology a Website Uses
Every website leaves traces of its tech stack in its HTML, headers, and network requests. Here's how to read those signals — from quick checks to deep analysis.
Why detect technology?
Common reasons people look up a website's tech stack:
- Sales prospecting — find out if a target company uses (or doesn't use) your type of product
- Competitive research — see what tools successful competitors are using
- Due diligence — evaluate a company's technical maturity before a partnership or acquisition
- Development — understand what framework or CMS a site uses before quoting a project
- Security research — identify outdated software versions that might have vulnerabilities
Method 1: Browser extensions (easiest)
Install one of these and every website you visit will show its detected technologies:
- Wappalyzer — the most popular. Detects CMS, frameworks, analytics, CDN, payment tools, and more. Free for individual use, paid for bulk lookups and API.
- WhatRuns — similar to Wappalyzer with a slightly different detection database.
- BuiltWith browser extension — free companion to the BuiltWith website. Shows a quick summary when you visit any site.
How they work: These extensions inspect the page's HTML, JavaScript libraries, HTTP headers, cookies, and DOM structure. They match patterns against a database of known technology signatures.
Limitations: They only work one site at a time. They can miss technologies that load conditionally (only on checkout pages, only for logged-in users). And they can produce false positives if a site includes remnant code from a technology it no longer actively uses.
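The pattern-matching these extensions do can be sketched in a few lines: a database maps each technology to patterns tested against the page HTML and the response headers. The fingerprints below are illustrative examples, not any extension's real database.

```python
import re

# Illustrative fingerprints -- real extensions ship thousands of these.
FINGERPRINTS = {
    "Shopify":   {"html": r"cdn\.shopify\.com"},
    "WordPress": {"html": r"/wp-content/"},
    "Express":   {"header": r"x-powered-by:\s*express"},
}

def detect(html: str, raw_headers: str) -> set[str]:
    """Return every technology with at least one matching pattern."""
    found = set()
    for tech, patterns in FINGERPRINTS.items():
        if "html" in patterns and re.search(patterns["html"], html):
            found.add(tech)
        if "header" in patterns and re.search(patterns["header"], raw_headers, re.I):
            found.add(tech)
    return found
```

The real engines also test cookies, JavaScript globals, and DOM structure, but the principle is the same: signature database in, matched technologies out.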
Method 2: Online lookup tools
Enter a URL and get a technology report:
- BuiltWith.com — the deepest free lookup. Shows current and historical technology usage, including analytics, hosting, ad networks, CMS, and frameworks.
- Wappalyzer.com/lookup — online version of the extension. Clean interface, categorized results.
- W3Techs.com — focuses on web technology market share and trends. Good for understanding how popular a technology is, not for individual lookups.
Best for: Quick one-off checks when you don't want to install anything.
Method 3: View source (manual)
Right-click → View Page Source, then look for these telltale signs:
CMS detection
- Shopify: `cdn.shopify.com` in script/link tags, `Shopify.theme` in JavaScript
- WordPress/WooCommerce: `/wp-content/` and `/wp-includes/` paths, `meta name="generator" content="WordPress"`
- Magento: `/static/frontend/` paths, `mage/cookies` in scripts
- Squarespace: `static1.squarespace.com` or `sqsp` classes
- Wix: `static.wixstatic.com`, `wix-code-sdk`
Analytics and tracking
- Google Analytics 4: `gtag('config', 'G-XXXXXXX')` or `googletagmanager.com`
- Meta Pixel: `fbq('init'` or `connect.facebook.net`
- Hotjar: `static.hotjar.com`
- Klaviyo: `static.klaviyo.com` or `_learnq` variable
Payment providers
- Stripe: `js.stripe.com`
- PayPal: `paypal.com/sdk` or `paypalobjects.com`
- Klarna: `klarna.com` scripts
Pros: Works on any site, no tools needed, shows exactly what's in the HTML.
Cons: Time-consuming. Requires knowledge of what to look for. Misses server-side technologies (backend language, database, hosting) that don't leave client-side traces.
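The checklist above can be automated with a plain substring scan over the page source (fetch it with `curl -s https://example.com` or `urllib.request` for live use). The table below just encodes the telltale strings from the lists above:

```python
# The telltale strings from the lists above, grouped by category.
TELLTALES = {
    "CMS": {
        "Shopify": ["cdn.shopify.com", "Shopify.theme"],
        "WordPress": ["/wp-content/", "/wp-includes/"],
        "Magento": ["/static/frontend/", "mage/cookies"],
        "Squarespace": ["static1.squarespace.com"],
        "Wix": ["static.wixstatic.com"],
    },
    "Analytics": {
        "Google Analytics 4": ["googletagmanager.com"],
        "Meta Pixel": ["connect.facebook.net"],
        "Hotjar": ["static.hotjar.com"],
        "Klaviyo": ["static.klaviyo.com"],
    },
    "Payments": {
        "Stripe": ["js.stripe.com"],
        "PayPal": ["paypal.com/sdk", "paypalobjects.com"],
        "Klarna": ["klarna.com"],
    },
}

def scan_source(html: str) -> dict[str, list[str]]:
    """Group every technology whose telltale string appears in the source."""
    hits: dict[str, list[str]] = {}
    for category, techs in TELLTALES.items():
        matched = [tech for tech, needles in techs.items()
                   if any(needle in html for needle in needles)]
        if matched:
            hits[category] = matched
    return hits
```

Substring matching is crude (it will flag remnant code just like the extensions do), but it makes the manual method repeatable.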
Method 4: HTTP headers
Open your browser's developer tools (F12) → Network tab → reload the page → click the first request → Headers tab. Look for:
- `Server:` — might say `nginx`, `Apache`, `cloudflare`, etc.
- `X-Powered-By:` — sometimes reveals `PHP/8.2`, `Express`, `ASP.NET`
- `X-Shopify-Stage:` — confirms Shopify
- `X-WordPress:` or `Link: ... wp-json` — confirms WordPress
- `Set-Cookie:` — cookie names reveal platforms (e.g., `_shopify_s`, `wp_` prefixes)
Many modern sites strip these headers for security, so this doesn't always work — but when it does, it's authoritative.
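The header checks above are easy to script. This sketch takes a dict of response headers (for live use, `urllib.request.urlopen(url).headers` yields one) and reports what they reveal; the exact strings it looks for come from the list above:

```python
def fingerprint_headers(headers: dict[str, str]) -> list[str]:
    """Map revealing HTTP response headers to the platforms they suggest."""
    h = {k.lower(): v for k, v in headers.items()}  # header names are case-insensitive
    findings = []
    if "server" in h:
        findings.append(f"Server software: {h['server']}")
    if "x-powered-by" in h:
        findings.append(f"Powered by: {h['x-powered-by']}")
    if "x-shopify-stage" in h:
        findings.append("Platform: Shopify")
    if "wp-json" in h.get("link", ""):
        findings.append("Platform: WordPress")
    if "_shopify_s" in h.get("set-cookie", ""):
        findings.append("Platform: Shopify (cookie)")
    return findings
```

An empty result usually means the site strips identifying headers, not that nothing is running.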
Method 5: DNS and infrastructure
For hosting and infrastructure detection:
- DNS lookup — `dig example.com` or use an online tool. CNAME records reveal hosting (e.g., `shops.myshopify.com`, `*.wpengine.com`).
- IP lookup — resolve the IP and check who owns it. Cloud providers (AWS, GCP, Cloudflare) are identifiable by IP range.
- SSL certificate — click the padlock in your browser. The issuer and subject alternative names can reveal hosting platforms.
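Once you have a CNAME target (from `dig example.com CNAME` or a DNS library), mapping it to a provider is just suffix matching. The suffix table below is a small hand-picked sample, not an exhaustive list:

```python
from typing import Optional

# Sample CNAME suffixes that identify a hosting platform (illustrative, not exhaustive).
CNAME_HOSTS = {
    ".myshopify.com": "Shopify",
    ".wpengine.com": "WP Engine",
    ".github.io": "GitHub Pages",
    ".netlify.app": "Netlify",
    ".cloudfront.net": "Amazon CloudFront",
}

def host_from_cname(cname: str) -> Optional[str]:
    """Match a CNAME target to a known hosting provider, if any."""
    target = cname.rstrip(".").lower()  # DNS answers often end with a trailing dot
    for suffix, provider in CNAME_HOSTS.items():
        if target.endswith(suffix):
            return provider
    return None
```
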
Method 6: Bulk technology detection at scale
If you need to check hundreds or thousands of websites, individual lookups don't scale. Options:
- Wappalyzer API — programmatic access to their detection engine. Paid plans from $250/month.
- BuiltWith API — similar, with historical data. Enterprise pricing.
- Store databases — platforms like Store Leads, BuiltWith, or Veltima pre-crawl websites and let you filter by technology without doing any detection yourself.
- Build your own — open-source tools like Webanalyze (Go) or python-Wappalyzer can be self-hosted, but you need to manage crawling infrastructure and keep detection patterns updated.
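If you do build your own, the core loop is a concurrent crawl feeding a detector. A minimal sketch with a thread pool — the `fetch` and `detect` callables are injected so you can swap in a real HTTP client, retries, or rate limiting:

```python
from concurrent.futures import ThreadPoolExecutor

def bulk_detect(urls, fetch, detect, workers=8):
    """Fetch many sites concurrently and run a detector over each page source.

    `fetch` is any callable url -> html (e.g. a urllib wrapper); `detect` is
    any callable html -> list of technology names.
    """
    def check(url):
        try:
            return url, detect(fetch(url))
        except Exception as exc:  # one dead site shouldn't kill the whole run
            return url, [f"error: {exc}"]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return dict(pool.map(check, urls))
```

Be polite at scale: respect robots.txt, throttle per-domain, and expect a meaningful fraction of fetches to fail.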
What technology detection can't tell you
Important limitations to understand:
- Backend languages — if a site uses Python, Ruby, or Go on the backend, there's often no client-side trace. You might detect the framework (Django, Rails) but not always.
- Databases — MySQL vs PostgreSQL vs MongoDB is invisible from the outside.
- Internal tools — CRM, ERP, project management tools leave no trace on the public website.
- Removed technologies — remnant code (an old analytics snippet, an abandoned A/B test) can cause false positives. Just because the code is in the HTML doesn't mean the tool is actively used.
- Hidden pages — some technologies only load on pages that aren't publicly accessible (checkout, account, admin), so external detection never sees them.
Quick reference: detection methods compared
| Method | Speed | Depth | Scale | Cost |
|---|---|---|---|---|
| Browser extension | Instant | Good | 1 site | Free |
| Online lookup | Seconds | Good | 1 site | Free |
| View source | Minutes | Deep | 1 site | Free |
| HTTP headers | Seconds | Limited | 1 site | Free |
| API (Wappalyzer/BuiltWith) | Seconds | Good | Bulk | $250+/mo |
| Pre-crawled database | Instant | Varies | Bulk | $0-250/mo |
Skip the manual detective work
Veltima detects 214+ technologies across thousands of e-commerce stores. Search, filter, and export — no crawling required.
Try it free