
  • How to scrape restaurant reviews from UberEats.com using JavaScript?

    Posted by Ricardo Urbain on 12/21/2024 at 5:23 am

    Scraping restaurant reviews from UberEats.com using JavaScript allows you to collect data such as restaurant names, review ratings, and comments. Using Node.js with Puppeteer, you can automate browser interactions to handle dynamic content and extract relevant details. Below is a sample script for scraping UberEats reviews.

    const puppeteer = require('puppeteer');

    (async () => {
        // Launch a headless browser instance
        const browser = await puppeteer.launch({ headless: true });
        const page = await browser.newPage();

        // UberEats listing page to start from
        const url = 'https://www.ubereats.com/near-me';
        await page.goto(url, { waitUntil: 'networkidle2' });

        // Extract data from the rendered DOM. Note: the class names below are
        // illustrative -- UberEats uses generated class names, so inspect the
        // live page and update the selectors before running this.
        const reviews = await page.evaluate(() => {
            const reviewList = [];
            const items = document.querySelectorAll('.restaurant-card');
            items.forEach(item => {
                const name = item.querySelector('.restaurant-name')?.textContent.trim() || 'Name not available';
                const rating = item.querySelector('.review-rating')?.textContent.trim() || 'Rating not available';
                const comment = item.querySelector('.review-comment')?.textContent.trim() || 'Comment not available';
                reviewList.push({ name, rating, comment });
            });
            return reviewList;
        });

        console.log(reviews);
        await browser.close();
    })();
    

    This script navigates to UberEats’s restaurant listing page, waits for the network to go idle, and extracts restaurant names, review ratings, and comments. As written it only covers a single page; adding pagination handling lets you collect data from multiple pages, and randomizing the timing between requests helps avoid detection by UberEats’s anti-scraping systems.
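
    For example, a small random-delay helper can be dropped in between page visits. This is a minimal sketch; the randomDelay name and the 2–6 second bounds are arbitrary choices, not values tied to UberEats.

    // Resolve after a random pause between minMs and maxMs (illustrative bounds)
    function randomDelay(minMs = 2000, maxMs = 6000) {
        const ms = minMs + Math.floor(Math.random() * (maxMs - minMs));
        return new Promise(resolve => setTimeout(resolve, ms));
    }

    // Usage between navigations:
    // await page.goto(url, { waitUntil: 'networkidle2' });
    // await randomDelay();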

  • 2 Replies
  • Hadriana Misaki

    Member
    12/24/2024 at 6:42 am

    Adding pagination to the UberEats scraper helps ensure that all restaurant reviews are collected. Reviews and ratings are often spread across multiple pages, so automating navigation through the “Next” button produces a more complete dataset, and introducing random delays between requests mimics human behavior and reduces the likelihood of detection; a sketch of such a loop is below. This functionality is useful for analyzing customer feedback across different restaurants.
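
    A minimal pagination loop might look like the following. It is a sketch only: the .next-page selector, the extractReviews callback, and the maxPages cap are assumptions, not actual UberEats markup or API.

    // Hypothetical pagination loop -- adjust selectors to the live page.
    async function scrapeAllPages(page, extractReviews, maxPages = 10) {
        const allReviews = [];
        for (let i = 0; i < maxPages; i++) {
            allReviews.push(...await extractReviews(page));

            // Stop when no "Next" control is present
            const nextButton = await page.$('.next-page');
            if (!nextButton) break;

            // Random pause before moving on, to mimic human browsing
            await new Promise(r => setTimeout(r, 2000 + Math.random() * 4000));
            await nextButton.click();
            await page.waitForNetworkIdle().catch(() => {});
        }
        return allReviews;
    }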

  • Thietmar Beulah

    Member
    01/01/2025 at 11:09 am

    Error handling is essential for maintaining the reliability of the scraper when working with dynamic content on UberEats. Missing elements, such as ratings or comments, should not cause the scraper to fail. Adding conditional checks for null values allows the script to skip problematic entries and log them for review. Regular updates to the script ensure compatibility with any changes to UberEats’s layout. These practices improve the scraper’s robustness and usability over time.
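
    A minimal sketch of that kind of guarded extraction, assuming the same placeholder selectors as the script above (the extractReviewsSafely name is hypothetical):

    // Null-check each field, then skip and log incomplete entries instead of
    // letting one missing element crash the run. Selectors are placeholders.
    async function extractReviewsSafely(page) {
        const raw = await page.evaluate(() =>
            Array.from(document.querySelectorAll('.restaurant-card')).map(item => ({
                name: item.querySelector('.restaurant-name')?.textContent.trim() ?? null,
                rating: item.querySelector('.review-rating')?.textContent.trim() ?? null,
                comment: item.querySelector('.review-comment')?.textContent.trim() ?? null,
            }))
        );

        const skipped = raw.filter(r => !r.name || !r.rating);
        if (skipped.length) {
            console.warn(`Skipped ${skipped.length} incomplete entries`, skipped);
        }
        return raw.filter(r => r.name && r.rating);
    }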
