Integration

Pipeline Integration

Urx is designed to work seamlessly in command-line pipelines and with other security tools.

Standard Input/Output

Urx reads domains from standard input and outputs URLs to standard output, making it perfect for piping:

cat domains.txt | urx | grep "api"
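
Any tool that emits line-delimited domains can feed urx the same way. A sketch, assuming ProjectDiscovery's subfinder is available:

subfinder -d example.com -silent | urx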

With Security Tools

Nuclei

Scan for vulnerabilities in discovered JavaScript files:

urx example.com -e js | nuclei -tags xss

httpx

Probe discovered URLs for HTTP information:

urx example.com | httpx -silent -status-code
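
To keep only live endpoints, httpx can match on status code with -mc; a sketch writing the survivors to a file:

urx example.com | httpx -silent -mc 200 -o live.txt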

gf (pattern matching)

Filter URLs for specific patterns:

urx example.com | gf xss
urx example.com | gf redirect
urx example.com | gf ssrf

ffuf

Fuzz discovered endpoints (each URL read from stdin replaces the FUZZ keyword):

urx example.com --patterns api | ffuf -w - -u FUZZ

waybackurls / gau

Combine with other URL collection tools:

(urx example.com; echo example.com | waybackurls; gau example.com) | sort -u
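
For a persistent, deduplicated list across repeated runs, tomnomnom's anew (an assumed companion tool here) appends only lines it hasn't seen before:

(urx example.com; gau example.com) | anew all-urls.txt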

Notification Integration

Notify

Send new URLs to various notification channels:

urx target.com --incremental --silent | notify -silent

Discord Webhook

urx example.com | while read -r url; do
  curl -X POST "$WEBHOOK_URL" \
    -H "Content-Type: application/json" \
    -d "{\"content\":\"$url\"}"
done

Database Integration

PostgreSQL

Store results in a database:

urx example.com -f json | jq -r '.url' | while read -r url; do
  echo "INSERT INTO urls (url) VALUES (:'u');" | psql -v u="$url"
done
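
Row-at-a-time inserts are slow for large result sets; a single \copy streams every URL in one call. A sketch, assuming the same urls(url) table:

urx example.com -f json | jq -r '.url' | psql -c "\copy urls(url) from stdin"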

MongoDB

urx example.com -f json | mongoimport --db security --collection urls
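
Re-running that import inserts duplicate documents; mongoimport's upsert mode keyed on the url field avoids this, assuming url uniquely identifies a document:

urx example.com -f json | mongoimport --db security --collection urls --mode upsert --upsertFields url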

Continuous Monitoring

Daily Cron Job

Monitor targets daily for new URLs:

# Add to crontab
0 0 * * * /usr/local/bin/urx target.com --incremental --silent >> /var/log/urx.log
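
cron treats an unescaped % in the command field as a newline, so a date-stamped variant of the entry above needs \%:

0 0 * * * /usr/local/bin/urx target.com --incremental --silent >> /var/log/urx-$(date +\%F).log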

With Redis for Distributed Scanning

urx example.com --cache-type redis --redis-url redis://central-cache:6379 --incremental
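
Several workers can share that cache and split the target list between them; the assumption here is that the shared Redis cache is what keeps incremental state consistent across hosts. A minimal sketch with a hypothetical targets.txt:

# split the targets into two shards, one per worker
split -n l/2 targets.txt shard-

# worker 1
cat shard-aa | urx --cache-type redis --redis-url redis://central-cache:6379 --incremental

# worker 2
cat shard-ab | urx --cache-type redis --redis-url redis://central-cache:6379 --incremental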

CI/CD Integration

GitHub Actions

name: URL Discovery
on:
  schedule:
    - cron: '0 0 * * *'
jobs:
  discover:
    runs-on: ubuntu-latest
    steps:
      - name: Install Urx
        run: cargo install urx
      - name: Run Discovery
        run: urx example.com --incremental -o results.txt
      - name: Upload Results
        uses: actions/upload-artifact@v4
        with:
          name: urls
          path: results.txt
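
Hosted runners are ephemeral, so --incremental has nothing to compare against unless its state survives between runs. A sketch using actions/cache, assuming urx keeps its incremental state under ~/.cache/urx (adjust the path to your setup); place it before the discovery step:

      - name: Restore Urx cache
        uses: actions/cache@v4
        with:
          path: ~/.cache/urx
          key: urx-cache-${{ github.run_id }}
          restore-keys: urx-cache-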

API Integration

Custom Processing

urx example.com -f json | python3 process_urls.py

Example Python script:

import json
import sys

for line in sys.stdin:
    line = line.strip()
    if not line:
        continue  # skip blank lines
    data = json.loads(line)
    # Process URL data
    print(f"Processing: {data['url']}")

Docker Integration

Run in Container

docker run --rm \
  -v "$(pwd)":/data \
  ghcr.io/hahwul/urx:latest \
  example.com -o /data/results.txt
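
The container reads stdin just like the native binary, so adding -i lets you pipe domains in:

cat domains.txt | docker run --rm -i ghcr.io/hahwul/urx:latest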

Docker Compose for Monitoring Stack

services:
  urx:
    image: ghcr.io/hahwul/urx:latest
    command: example.com --cache-type redis --redis-url redis://redis:6379 --incremental
    depends_on:
      - redis
  redis:
    image: redis:alpine
    volumes:
      - redis-data:/data
volumes:
  redis-data:
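
Bring the stack up and follow the scanner's output with the standard Compose commands:

docker compose up -d
docker compose logs -f urx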

Kubernetes CronJob

apiVersion: batch/v1
kind: CronJob
metadata:
  name: urx-scanner
spec:
  schedule: "0 0 * * *"
  jobTemplate:
    spec:
      template:
        spec:
          containers:
          - name: urx
            image: ghcr.io/hahwul/urx:latest
            args: ["example.com", "--incremental", "--silent"]
          restartPolicy: OnFailure
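
Apply the manifest and, when needed, trigger an ad-hoc run from the CronJob (the manifest file name here is assumed):

kubectl apply -f urx-cronjob.yaml
kubectl create job --from=cronjob/urx-scanner urx-manual-run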

Output Format Integration

JSON for jq Processing

urx example.com -f json | jq 'select(.status == 200)'

CSV for Spreadsheet Import

urx example.com -f csv -o results.csv
# Import into Excel, Google Sheets, etc.

Multi-Tool Workflows

Complete Reconnaissance Pipeline

#!/bin/bash
TARGET="$1"

# Discover URLs
urx "$TARGET" --subs -e js,json,xml -o urls.txt

# Probe for live URLs
cat urls.txt | httpx -silent -o live.txt

# Scan for vulnerabilities
cat live.txt | nuclei -t cves/ -o vulnerabilities.txt

# Check for secrets in JS files
cat urls.txt | grep "\.js$" | while read -r url; do
  curl -s "$url" | grep -i "api.*key"
done
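
Saved as recon.sh (a name assumed here), the pipeline runs as:

chmod +x recon.sh
./recon.sh example.com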

Bug Bounty Automation

#!/bin/bash
TARGET="$1"

# Initial discovery
urx "$TARGET" --subs --incremental -o new-urls.txt

# Filter interesting endpoints
cat new-urls.txt | gf redirect > potential-redirects.txt
cat new-urls.txt | gf xss > potential-xss.txt
cat new-urls.txt | gf sqli > potential-sqli.txt

# Notify on Slack
if [ -s new-urls.txt ]; then
  COUNT=$(wc -l < new-urls.txt)
  curl -X POST "$SLACK_WEBHOOK" \
    -H "Content-Type: application/json" \
    -d "{\"text\":\"Found $COUNT new URLs for $TARGET\"}"
fi
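
The script expects SLACK_WEBHOOK in the environment; a typical invocation, with the script name assumed and the webhook URL elided:

export SLACK_WEBHOOK="https://hooks.slack.com/services/..."
./bounty.sh example.com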