
  • How to extract fundraiser details from GoFundMe.com using Python?

    Posted by Lenz Dominic on 12/20/2024 at 9:38 am

    Scraping fundraiser details from GoFundMe.com using Python can be helpful for analyzing campaigns, such as gathering information on titles, goals, and amounts raised. By using Python’s requests library for sending HTTP requests and BeautifulSoup for parsing HTML, you can efficiently extract relevant data. The process involves sending a GET request to the page, parsing the HTML response, and targeting elements containing the fundraiser details. Below is a sample Python script for scraping fundraisers from GoFundMe.

    import requests
    from bs4 import BeautifulSoup

    # Target URL for GoFundMe's campaign discovery page
    url = "https://www.gofundme.com/discover"
    # A browser-like User-Agent helps avoid basic request filtering
    headers = {
        "User-Agent": "Mozilla/5.0"
    }
    response = requests.get(url, headers=headers)
    if response.status_code == 200:
        soup = BeautifulSoup(response.content, "html.parser")
        # Class names like "campaign-card" depend on GoFundMe's current markup
        campaigns = soup.find_all("div", class_="campaign-card")
        for campaign in campaigns:
            # Look up each element once, then fall back to a placeholder if missing
            title_tag = campaign.find("h3")
            goal_tag = campaign.find("span", class_="goal")
            raised_tag = campaign.find("span", class_="raised")
            title = title_tag.text.strip() if title_tag else "Title not available"
            goal = goal_tag.text.strip() if goal_tag else "Goal not available"
            raised = raised_tag.text.strip() if raised_tag else "Amount raised not available"
            print(f"Title: {title}, Goal: {goal}, Raised: {raised}")
    else:
        print(f"Failed to fetch GoFundMe page (status code {response.status_code}).")
    

    This script extracts titles, funding goals, and amounts raised from GoFundMe campaigns. It can be extended to handle pagination by identifying and navigating through additional campaign pages. Adding random delays between requests reduces the risk of being detected by anti-scraping measures. Proper error handling ensures smooth operation even when elements are missing.
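
    As a rough sketch of the delay idea, the snippet below pauses a random few seconds between page requests. The ?page= query parameter and the page range are assumptions for illustration only; the real discover page may load additional campaigns via JavaScript rather than numbered pages.

    import random
    import time

    import requests

    headers = {"User-Agent": "Mozilla/5.0"}
    for page in range(1, 4):  # hypothetical page range, for illustration only
        # Assumed URL pattern; inspect the site to confirm how pages are actually addressed
        page_url = f"https://www.gofundme.com/discover?page={page}"
        response = requests.get(page_url, headers=headers)
        print(f"Page {page}: status {response.status_code}")
        # Wait 2-5 seconds between requests to look less like an automated client
        time.sleep(random.uniform(2, 5))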

  • 2 Replies
  • Katerina Renata

    Member
    12/25/2024 at 7:48 am

    Implementing pagination lets the scraper collect far more campaign data. GoFundMe lists only a limited number of fundraisers per page, so walking through all available pages produces a more complete dataset. By automating navigation through “Next” links, the scraper can capture additional campaigns, and introducing random delays between page requests further reduces the likelihood of detection.
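
    A minimal sketch of that idea, assuming the listing page exposes a plain “Next” anchor (the rel="next" attribute and the loop structure here are assumptions, not GoFundMe’s actual markup):

    import random
    import time
    from urllib.parse import urljoin

    import requests
    from bs4 import BeautifulSoup

    headers = {"User-Agent": "Mozilla/5.0"}
    url = "https://www.gofundme.com/discover"
    while url:
        response = requests.get(url, headers=headers)
        if response.status_code != 200:
            break
        soup = BeautifulSoup(response.content, "html.parser")
        # Parse campaign cards here, as in the original script
        # Assumed selector: a "Next" link marked with rel="next"
        next_link = soup.find("a", rel="next")
        url = urljoin(url, next_link["href"]) if next_link else None
        # Random pause before fetching the next page
        time.sleep(random.uniform(2, 5))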

  • Taliesin Clisthenes

    Member
    01/03/2025 at 7:33 am

    Adding robust error handling keeps the scraper running smoothly even if GoFundMe updates its page layout. For example, if elements like goals or raised amounts are missing, the scraper should log those cases instead of crashing. Conditional checks for missing elements or try/except blocks let the script keep working, and logging skipped campaigns helps identify where the selectors need updating. These practices ensure long-term reliability and adaptability.
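
    A short sketch of that approach using try/except and the standard logging module; the parse_campaign helper and its selectors are hypothetical stand-ins for whatever parsing logic the scraper actually uses:

    import logging

    logging.basicConfig(level=logging.INFO)
    logger = logging.getLogger("gofundme_scraper")

    def parse_campaign(campaign):
        # Hypothetical helper: raises AttributeError when an expected tag is missing
        return {
            "title": campaign.find("h3").text.strip(),
            "goal": campaign.find("span", class_="goal").text.strip(),
            "raised": campaign.find("span", class_="raised").text.strip(),
        }

    def parse_all(campaigns):
        results = []
        for index, campaign in enumerate(campaigns):
            try:
                results.append(parse_campaign(campaign))
            except AttributeError:
                # Log and skip cards whose layout does not match the expected selectors
                logger.warning("Skipping campaign %d: missing expected elements", index)
        return results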
