
Cryptocurrency prices are like roller coasters: constantly changing, fast-moving, and often unpredictable. But that's the beauty of it, right? The key to capitalizing on these fluctuations is having a reliable, real-time data source at your fingertips. That's where an automated crypto price tracker comes in. Building one might sound like a big undertaking, but with Python it's entirely doable. I'll show you how to scrape the latest prices of the top 150 cryptocurrencies from Crypto.com, rotate proxies to avoid detection, and export the data to a CSV file. The tracker will update every five minutes, so you always have the most current information.
First, we need to import a few essential libraries:
import requests
from bs4 import BeautifulSoup
import csv
import time
import random
The requests library pulls down the website content. BeautifulSoup parses the HTML and extracts the data we need. The csv module saves the results, while time and random control when we fetch new data and which proxy we use.
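If requests and BeautifulSoup aren't installed yet, both are available from PyPI:
pip install requests beautifulsoup4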
Websites like Crypto.com are no fans of scrapers. If you don't use proxies, your requests might get blocked. Let's set up a proxy to make sure that doesn't happen. One detail that's easy to miss: requests chooses a proxy based on the URL scheme, and Crypto.com is served over HTTPS, so the dictionary needs both an "http" and an "https" entry. Here's a simple setup for non-authenticated proxies:
proxy = {
    "http": "http://Your_proxy_IP_Address:Your_proxy_port",
    "https": "http://Your_proxy_IP_Address:Your_proxy_port",
}
html = requests.get(url, proxies=proxy)
For authenticated proxies, here's how you'd set it up:
proxy = {
    "http": "http://username:password@Your_proxy_IP_Address:Your_proxy_port",
    "https": "http://username:password@Your_proxy_IP_Address:Your_proxy_port",
}
html = requests.get(url, proxies=proxy)
You'll need to replace "Your_proxy_IP_Address" and "Your_proxy_port" with your actual proxy details.
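Before pointing the scraper at Crypto.com, it's worth confirming the proxy actually works. A quick sanity check, assuming your proxy is live, is to hit an IP-echo service such as httpbin.org; it should report your proxy's address rather than your own:
# httpbin.org/ip echoes the IP the request arrived from,
# so this should print the proxy's address, not yours.
check = requests.get("https://httpbin.org/ip", proxies=proxy, timeout=10)
print(check.json())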
Using the same proxy for too many requests is a red flag. The solution? Proxy rotation. Here's a simple function that picks a random proxy from a list to keep things fresh:
# List of proxies
proxies = [ 
    "username:password@Your_proxy_IP_Address:Your_proxy_port1",
    "username:password@Your_proxy_IP_Address:Your_proxy_port2",
    "username:password@Your_proxy_IP_Address:Your_proxy_port3",
]
# Method to rotate proxies
def get_proxy(): 
    proxy = random.choice(proxies)  # Randomly pick one
    return {"http": f'http://{proxy}', "https": f'http://{proxy}'}
Now, every time we send a request, it'll come from a different proxy.
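A quick way to see the rotation in action is to call the function a few times; each call should return a randomly chosen entry from the pool:
# Repeated calls spread requests across the whole proxy list.
for _ in range(3):
    print(get_proxy())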
We're after the top 150 cryptocurrencies. Using BeautifulSoup, we'll parse the HTML to grab each coin's name, ticker symbol, price, and 24-hour price change. One caveat: the css-* class names below are auto-generated, so they may change whenever Crypto.com updates its front end; verify them in your browser's developer tools before running the script. Here's the function that pulls everything together:
def get_crypto_prices():
    url = "https://crypto.com/price"
    html = requests.get(url, proxies=get_proxy(), timeout=10)  # Time out instead of hanging on a dead proxy
    soup = BeautifulSoup(html.text, "html.parser")
    price_rows = soup.find_all('tr', class_='css-1cxc880')  # Locate price rows
    prices = []
    for row in price_rows:
        # Each field falls back to a placeholder if the markup has changed
        coin_name_tag = row.find('p', class_='css-rkws3')
        name = coin_name_tag.get_text() if coin_name_tag else "no name entry"

        coin_ticker_tag = row.find('span', class_='css-1jj7b1a')
        ticker = coin_ticker_tag.get_text() if coin_ticker_tag else "no ticker entry"

        coin_price_tag = row.find('div', class_='css-b1ilzc')
        price = coin_price_tag.text.strip() if coin_price_tag else "no price entry"

        coin_percentage_tag = row.find('p', class_='css-yyku61')
        percentage = coin_percentage_tag.text.strip() if coin_percentage_tag else "no percentage entry"

        prices.append({
            "Coin": name,
            "Ticker": ticker,
            "Price": price,
            "24hr-Percentage": percentage
        })
    return prices
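The function returns a list of dictionaries, one per coin. A single entry looks roughly like this (the values here are illustrative, not live data):
prices = get_crypto_prices()
print(prices[0])
# e.g. {'Coin': 'Bitcoin', 'Ticker': 'BTC', 'Price': '$64,210.55', '24hr-Percentage': '+1.23%'}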
Once we've scraped the data, we need to store it in a CSV file. Here's how we can write the data to a CSV:
def export_to_csv(prices, filename="crypto_prices.csv"):
    # "w" overwrites the file on each run, so the CSV always holds the latest snapshot
    with open(filename, "w", newline="", encoding="utf-8") as file:
        fieldnames = ["Coin", "Ticker", "Price", "24hr-Percentage"]
        writer = csv.DictWriter(file, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(prices)
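If you'd rather keep a running history than overwrite the file each cycle, a small variation works too. This is just a sketch; append_to_csv and the crypto_price_history.csv filename are hypothetical names, not part of the original script:
import os
from datetime import datetime

def append_to_csv(prices, filename="crypto_price_history.csv"):
    # Hypothetical variant: appends each snapshot with a timestamp column
    # instead of overwriting the file on every run.
    fieldnames = ["Timestamp", "Coin", "Ticker", "Price", "24hr-Percentage"]
    new_file = not os.path.exists(filename)
    with open(filename, "a", newline="", encoding="utf-8") as file:
        writer = csv.DictWriter(file, fieldnames=fieldnames)
        if new_file:
            writer.writeheader()
        stamp = datetime.now().isoformat(timespec="seconds")
        for entry in prices:
            writer.writerow({"Timestamp": stamp, **entry})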
Now, let's create the loop that keeps our tracker running. It grabs the data, exports it, and waits five minutes before the next update; a try/except around the scrape ensures a flaky proxy or a network hiccup doesn't crash the tracker.
if __name__ == "__main__":
    while True:
        try:
            prices = get_crypto_prices()
            export_to_csv(prices)
            print("Prices updated. Waiting for the next update...")
        except requests.exceptions.RequestException as e:
            # A dead proxy or a network hiccup shouldn't kill the tracker
            print(f"Request failed, retrying next cycle: {e}")
        time.sleep(300)  # Update every 5 minutes
Here's the complete script that integrates all the steps above:
import requests
from bs4 import BeautifulSoup
import csv
import time
import random
# List of proxies
proxies = [
    "username:password@Your_proxy_IP_Address:Your_proxy_port1",
    "username:password@Your_proxy_IP_Address:Your_proxy_port2",
    "username:password@Your_proxy_IP_Address:Your_proxy_port3",
]
# Proxy rotation function
def get_proxy(): 
    proxy = random.choice(proxies)
    return {"http": f'http://{proxy}', "https": f'http://{proxy}'}
def get_crypto_prices():
    url = "https://crypto.com/price"
    html = requests.get(url, proxies=get_proxy(), timeout=10)  # Time out instead of hanging on a dead proxy
    soup = BeautifulSoup(html.text, "html.parser")
    price_rows = soup.find_all('tr', class_='css-1cxc880')  # Locate price rows
    prices = []
    for row in price_rows:
        # Each field falls back to a placeholder if the markup has changed
        coin_name_tag = row.find('p', class_='css-rkws3')
        name = coin_name_tag.get_text() if coin_name_tag else "no name entry"

        coin_ticker_tag = row.find('span', class_='css-1jj7b1a')
        ticker = coin_ticker_tag.get_text() if coin_ticker_tag else "no ticker entry"

        coin_price_tag = row.find('div', class_='css-b1ilzc')
        price = coin_price_tag.text.strip() if coin_price_tag else "no price entry"

        coin_percentage_tag = row.find('p', class_='css-yyku61')
        percentage = coin_percentage_tag.text.strip() if coin_percentage_tag else "no percentage entry"

        prices.append({
            "Coin": name,
            "Ticker": ticker,
            "Price": price,
            "24hr-Percentage": percentage
        })
    return prices
def export_to_csv(prices, filename="crypto_prices.csv"):
    # "w" overwrites the file on each run, so the CSV always holds the latest snapshot
    with open(filename, "w", newline="", encoding="utf-8") as file:
        fieldnames = ["Coin", "Ticker", "Price", "24hr-Percentage"]
        writer = csv.DictWriter(file, fieldnames=fieldnames)
        writer.writeheader()
        writer.writerows(prices)
if __name__ == "__main__":
    while True:
        try:
            prices = get_crypto_prices()
            export_to_csv(prices)
            print("Prices updated. Waiting for the next update...")
        except requests.exceptions.RequestException as e:
            # A dead proxy or a network hiccup shouldn't kill the tracker
            print(f"Request failed, retrying next cycle: {e}")
        time.sleep(300)  # Update every 5 minutes
This Python-based tracker is lightweight, flexible, and easy to tweak. You can track more coins, change the update interval, or switch the output format as needed. By rotating proxies and keeping the request frequency modest, we keep the scraper low-profile and reduce the chance of being blocked. If you want to go further, consider adding features like price alerts or extra data points such as market cap and trading volume. The possibilities are endless.
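As a taste of the price-alert idea, here's a minimal sketch. The check_alert helper, the default ticker and threshold, and the string-to-float parsing are all assumptions you'd adapt to your own needs:
def check_alert(prices, ticker="BTC", threshold=70000.0):
    # Hypothetical helper: prints a message when the chosen coin crosses
    # a price threshold. The parsing assumes prices are formatted like
    # "$64,210.55"; adjust it if the site renders them differently.
    for entry in prices:
        if entry["Ticker"] == ticker:
            value = float(entry["Price"].replace("$", "").replace(",", ""))
            if value >= threshold:
                print(f"Alert: {ticker} is at ${value:,.2f}, above your {threshold:,.0f} threshold")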