Bulk URL Shortening: Shorten Hundreds of Links in One Operation

Shorten URLs in bulk using the URLW API. Learn how to automate batch link creation with a shell script or Python, including rate limit handling and error recovery.

If you manage a large link library — an e-commerce catalogue, a media archive, a content library with thousands of assets — shortening URLs one by one through a web interface is not a viable workflow. The URLW API makes bulk URL shortening straightforward, whether you are migrating an existing link library, generating links for a product catalogue, or automating link creation as part of a data pipeline.

When Bulk Shortening Makes Sense

Common scenarios for bulk URL shortening:

  • E-commerce catalogues: Create short, clean product links for every SKU in your catalogue, suitable for use in SMS campaigns, price comparison sites, or affiliate platforms.
  • Content migrations: When moving a website to a new domain or URL structure, create short links for all existing URLs to maintain compatibility with any previously distributed links.
  • Campaign preparation: Pre-generate hundreds of campaign links from a spreadsheet before a scheduled campaign launch.
  • CSV import: Import a CSV of destination URLs and slugs to create all links in one operation (see the CSV import guide for the web-based workflow).
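For any of these scenarios, the starting point is the same input file. If your source data lives in a database or catalogue export rather than a spreadsheet, the CSV can be generated programmatically. A minimal sketch, using the sample product data from this guide (in practice the `products` list would come from your own catalogue):

```python
import csv

# Illustrative product data; replace with your catalogue export
products = [
    ('https://monsite.fr/produit/chaise-ergonomique-pro', 'chaise-ergo'),
    ('https://monsite.fr/produit/bureau-standing-desk', 'bureau-standing'),
]

# Write url,slug rows with no header, as the bulk scripts below expect
with open('links.csv', 'w', newline='') as f:
    writer = csv.writer(f)
    writer.writerows(products)
```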

Shell Script: Bulk Shortening from a CSV File

The following bash script reads a CSV file with columns url,slug and creates a short link for each row using the URLW API:

#!/bin/bash
# bulk-shorten.sh
# Usage: ./bulk-shorten.sh links.csv
# CSV format: url,slug (one per line, no header)

API_TOKEN="${URLW_API_TOKEN}"
API_BASE="https://urlw.fr/api/v1"
INPUT_FILE="${1:-links.csv}"
OUTPUT_FILE="results_$(date +%Y%m%d_%H%M%S).csv"
DELAY=0.5  # seconds between requests (respect rate limits)

echo "url,slug,short_url,status" > "$OUTPUT_FILE"

# The || clause handles a final row with no trailing newline
while IFS=',' read -r url slug || [ -n "$url" ]; do
    # Strip Windows carriage returns and skip empty lines
    url="${url%$'\r'}"
    slug="${slug%$'\r'}"
    [ -z "$url" ] && continue

    # Build JSON payload
    if [ -n "$slug" ]; then
        payload="{\"url\":\"$url\",\"slug\":\"$slug\"}"
    else
        payload="{\"url\":\"$url\"}"
    fi

    # Make the API request
    response=$(curl -s -w "\n%{http_code}" \
        -X POST "${API_BASE}/links" \
        -H "Authorization: Bearer ${API_TOKEN}" \
        -H "Content-Type: application/json" \
        -d "$payload")

    http_code=$(echo "$response" | tail -n1)
    body=$(echo "$response" | sed '$d')  # everything except the status-code line

    if [ "$http_code" = "201" ]; then
        short_url=$(echo "$body" | grep -o '"short_url":"[^"]*"' | cut -d'"' -f4)
        echo "${url},${slug},${short_url},created" >> "$OUTPUT_FILE"
        echo "✓ Created: ${short_url}"
    elif [ "$http_code" = "429" ]; then
        echo "Rate limited — waiting 60 seconds..."
        sleep 60
        # Retry once
        response=$(curl -s -w "\n%{http_code}" \
            -X POST "${API_BASE}/links" \
            -H "Authorization: Bearer ${API_TOKEN}" \
            -H "Content-Type: application/json" \
            -d "$payload")
        http_code=$(echo "$response" | tail -n1)
        body=$(echo "$response" | sed '$d')
        if [ "$http_code" = "201" ]; then
            short_url=$(echo "$body" | grep -o '"short_url":"[^"]*"' | cut -d'"' -f4)
            echo "${url},${slug},${short_url},created" >> "$OUTPUT_FILE"
        else
            echo "${url},${slug},,error_${http_code}" >> "$OUTPUT_FILE"
        fi
    else
        error=$(echo "$body" | grep -o '"message":"[^"]*"' | cut -d'"' -f4)
        echo "${url},${slug},,error_${http_code}: ${error}" >> "$OUTPUT_FILE"
        echo "✗ Failed (${http_code}): ${url} — ${error}"
    fi

    sleep "$DELAY"

done < "$INPUT_FILE"

echo ""
echo "Done. Results saved to: $OUTPUT_FILE"

Sample Input CSV

https://monsite.fr/produit/chaise-ergonomique-pro,chaise-ergo
https://monsite.fr/produit/bureau-standing-desk,bureau-standing
https://monsite.fr/produit/ecran-4k-27-pouces,ecran-4k
https://monsite.fr/categorie/mobilier-bureau,mobilier
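Before running a batch against the live API, it can save a lot of failed requests to validate the input file first. A minimal sketch in Python; the slug pattern here (lowercase letters, digits, hyphens) is an assumption, so check the URLW API documentation for the exact slug rules:

```python
import csv
import re
from urllib.parse import urlparse

# Assumed slug charset; verify against the URLW API docs
SLUG_RE = re.compile(r'^[a-z0-9-]+$')

def validate_rows(path):
    """Return a list of (line_number, problem) tuples; empty means the file looks OK."""
    problems = []
    with open(path, newline='') as f:
        for n, row in enumerate(csv.reader(f), start=1):
            if not row or not row[0].strip():
                continue  # blank lines are skipped by the bulk scripts too
            parsed = urlparse(row[0].strip())
            if parsed.scheme not in ('http', 'https') or not parsed.netloc:
                problems.append((n, f'invalid URL: {row[0]!r}'))
            if len(row) > 1 and row[1].strip() and not SLUG_RE.match(row[1].strip()):
                problems.append((n, f'suspect slug: {row[1]!r}'))
    return problems
```

Run it once over links.csv and fix any reported rows before launching the batch.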

Python Alternative for Larger Datasets

For larger bulk operations (thousands of links), Python with the requests library provides better error handling, retry logic, and progress tracking:

import csv, os, time, requests

API_TOKEN = os.environ['URLW_API_TOKEN']
BASE_URL  = 'https://urlw.fr/api/v1'
HEADERS   = {'Authorization': f'Bearer {API_TOKEN}', 'Content-Type': 'application/json'}

def create_link(url, slug=None, retries=3):
    payload = {'url': url}
    if slug:
        payload['slug'] = slug
    for attempt in range(retries):
        r = requests.post(f'{BASE_URL}/links', json=payload, headers=HEADERS, timeout=10)
        if r.status_code == 201:
            return r.json()
        if r.status_code == 429:
            # Wait until the rate-limit window resets (fall back to ~60 s)
            wait = int(r.headers.get('X-RateLimit-Reset', time.time() + 60)) - int(time.time())
            time.sleep(max(wait, 5))
        else:
            return {'error': r.json().get('message', str(r.status_code))}
    return {'error': 'max_retries_exceeded'}

with open('links.csv') as f, open('results.csv', 'w', newline='') as out:
    reader = csv.reader(f)
    writer = csv.writer(out)
    writer.writerow(['url', 'slug', 'short_url', 'status'])
    for row in reader:
        if not row or not row[0].strip():
            continue  # skip blank lines
        url = row[0].strip()
        slug = row[1].strip() if len(row) > 1 and row[1].strip() else None
        result = create_link(url, slug)
        status = 'created' if 'short_url' in result else result.get('error', 'unknown')
        writer.writerow([url, slug or '', result.get('short_url', ''), status])
        time.sleep(0.2)
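For batches of thousands of links, a run can be interrupted partway through (network drop, laptop sleep, rate-limit exhaustion). A simple way to make the script above resumable is to read any existing results.csv before starting and skip URLs already marked as created. A minimal helper sketch, assuming the results.csv layout produced by the script above:

```python
import csv
import os

def already_created(results_path='results.csv'):
    """Return the set of source URLs already marked 'created' in a previous run."""
    done = set()
    if not os.path.exists(results_path):
        return done
    with open(results_path, newline='') as f:
        reader = csv.reader(f)
        next(reader, None)  # skip the header row
        for row in reader:
            if len(row) >= 4 and row[3] == 'created':
                done.add(row[0])
    return done

# In the main loop: skip any row whose URL is in already_created(),
# and append to results.csv ('a' mode) instead of overwriting it.
```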

Get Started

Create your account at /en/register and review the API documentation at /en/docs/api. Pro plans with higher API rate limits are available at /en/#pricing.

Try URLW for free

50 short links, REST API included, no credit card required.