
  • Scrape delivery options from Dunelm UK using Go

    Posted by Lela Sandy on 12/13/2024 at 7:22 am

    Scraping delivery options from Dunelm UK involves setting up a Go application that uses the Colly library to fetch pages and parse their HTML. Delivery options are typically displayed on product pages, often under a dedicated section that lists shipping methods, costs, and estimated delivery times. This information is crucial for customers and can include options such as standard delivery, next-day delivery, or click-and-collect services.
    To begin, you inspect the webpage using browser developer tools to identify the specific tags and classes that house delivery-related information. Once identified, you can use Colly’s OnHTML method to target these elements and extract their content. Handling edge cases, such as products without delivery details or location-specific options, ensures the scraper functions reliably across different pages.
    Below is a complete Go implementation for scraping delivery options from Dunelm UK:

    package main

    import (
    	"fmt"
    	"log"
    	"strings"

    	"github.com/gocolly/colly"
    )

    func main() {
    	// Create a new collector
    	c := colly.NewCollector()

    	// Scrape delivery options (the .delivery-options selector comes from
    	// inspecting the page; adjust it if the markup changes)
    	c.OnHTML(".delivery-options", func(e *colly.HTMLElement) {
    		deliveryOptions := strings.TrimSpace(e.Text)
    		fmt.Println("Delivery Options:")
    		fmt.Println(deliveryOptions)
    	})

    	// Handle request errors
    	c.OnError(func(_ *colly.Response, err error) {
    		log.Println("Error occurred:", err)
    	})

    	// Visit the Dunelm product page (replace with a real product URL)
    	err := c.Visit("https://www.dunelm.com/product-page")
    	if err != nil {
    		log.Fatalf("Failed to visit website: %v", err)
    	}
    }
    
  • 3 Replies
  • Aretha Melech

    Member
    12/14/2024 at 8:39 am

    The script could be improved by adding error handling for pages where the delivery options section might not exist. A simple check could be added to log cases where no delivery information is found.
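    One way to sketch that check, assuming the same placeholder `.delivery-options` selector and product URL as the original script: flip a flag when the section is seen, then inspect it in Colly's `OnScraped` callback, which fires after all other callbacks for a page have finished.

    ```go
    package main

    import (
    	"fmt"
    	"log"

    	"github.com/gocolly/colly"
    )

    // noDeliveryMsg builds the log line for a page with no delivery section.
    func noDeliveryMsg(url string) string {
    	return "no delivery options found on " + url
    }

    func main() {
    	c := colly.NewCollector()

    	// Flag flipped only if the delivery section is actually seen.
    	found := false

    	c.OnHTML(".delivery-options", func(e *colly.HTMLElement) {
    		found = true
    		fmt.Println("Delivery Options:", e.Text)
    	})

    	// OnScraped runs after all other callbacks for a page have finished,
    	// so the flag is final by the time it is checked here.
    	c.OnScraped(func(r *colly.Response) {
    		if !found {
    			log.Println(noDeliveryMsg(r.Request.URL.String()))
    		}
    	})

    	if err := c.Visit("https://www.dunelm.com/product-page"); err != nil {
    		log.Println("visit failed:", err)
    	}
    }
    ```

    If the scraper visits several pages with one collector, the flag would need to be reset per page, e.g. in an `OnRequest` callback.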

  • Ivo Joris

    Member
    12/18/2024 at 6:57 am

    Adding functionality to store the scraped delivery options in a database or exporting them to a CSV file would make the script more versatile. This would allow for better data management and further analysis.
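    A minimal CSV version might look like this, using the standard library's `encoding/csv` package; the output filename and the `.delivery-options` selector are placeholders carried over from the original script.

    ```go
    package main

    import (
    	"encoding/csv"
    	"io"
    	"log"
    	"os"
    	"strings"

    	"github.com/gocolly/colly"
    )

    // appendRow writes one url/option pair to w as a CSV record,
    // trimming the surrounding whitespace that e.Text tends to carry.
    func appendRow(w io.Writer, url, option string) error {
    	cw := csv.NewWriter(w)
    	if err := cw.Write([]string{url, strings.TrimSpace(option)}); err != nil {
    		return err
    	}
    	cw.Flush()
    	return cw.Error()
    }

    func main() {
    	f, err := os.Create("delivery_options.csv")
    	if err != nil {
    		log.Fatal(err)
    	}
    	defer f.Close()

    	// Header row.
    	appendRow(f, "url", "delivery_options")

    	c := colly.NewCollector()
    	c.OnHTML(".delivery-options", func(e *colly.HTMLElement) {
    		if err := appendRow(f, e.Request.URL.String(), e.Text); err != nil {
    			log.Println("csv write failed:", err)
    		}
    	})

    	if err := c.Visit("https://www.dunelm.com/product-page"); err != nil {
    		log.Println("visit failed:", err)
    	}
    }
    ```

    Swapping the file for a database insert would follow the same pattern: do the write inside the `OnHTML` callback.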

  • Soheil Sarala

    Member
    12/19/2024 at 5:28 am

    The script could be extended to handle pagination or extract delivery information for multiple products by dynamically following links from a category page. This would enable the collection of a more comprehensive dataset.
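    A sketch of that idea, assuming product URLs contain a `/product/` path segment and starting from a hypothetical category-page URL (both would need to be verified against the real site): follow matching links from the category page and extract delivery options from each product page visited.

    ```go
    package main

    import (
    	"fmt"
    	"log"
    	"strings"

    	"github.com/gocolly/colly"
    )

    // isProductLink reports whether a href looks like a Dunelm product page.
    // The "/product/" path segment is an assumption; verify it against the
    // real site structure.
    func isProductLink(href string) bool {
    	return strings.Contains(href, "/product/")
    }

    func main() {
    	c := colly.NewCollector(
    		// Keep the crawl on the Dunelm domain.
    		colly.AllowedDomains("www.dunelm.com"),
    	)

    	// Follow product links found on the category page.
    	c.OnHTML("a[href]", func(e *colly.HTMLElement) {
    		href := e.Attr("href")
    		if !isProductLink(href) {
    			return
    		}
    		e.Request.Visit(e.Request.AbsoluteURL(href))
    	})

    	// Extract delivery options from each product page visited.
    	c.OnHTML(".delivery-options", func(e *colly.HTMLElement) {
    		fmt.Printf("%s: %s\n", e.Request.URL, strings.TrimSpace(e.Text))
    	})

    	if err := c.Visit("https://www.dunelm.com/category-page"); err != nil {
    		log.Println("visit failed:", err)
    	}
    }
    ```

    Colly deduplicates visited URLs by default, so following links this way will not re-fetch the same product page twice.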
