{"id":3785,"date":"2025-02-20T16:59:21","date_gmt":"2025-02-20T16:59:21","guid":{"rendered":"https:\/\/rayobyte.com\/community\/?p=3785"},"modified":"2025-02-20T16:59:21","modified_gmt":"2025-02-20T16:59:21","slug":"harvesting-electronics-deals-from-cimri-com-using-go-sqlite-collecting-tech-discounts-best-selling-products-and-consumer-ratings","status":"publish","type":"post","link":"https:\/\/rayobyte.com\/community\/harvesting-electronics-deals-from-cimri-com-using-go-sqlite-collecting-tech-discounts-best-selling-products-and-consumer-ratings\/","title":{"rendered":"Harvesting Electronics Deals from Cimri.com Using Go &amp; SQLite: Collecting Tech Discounts, Best-Selling Products, and Consumer Ratings"},"content":{"rendered":"<h2 id=\"harvesting-electronics-deals-from-cimri-com-using-go-sqlite-collecting-tech-discounts-best-selling-products-and-consumer-ratings-fppouiMawJ\">Harvesting Electronics Deals from Cimri.com Using Go &amp; SQLite: Collecting Tech Discounts, Best-Selling Products, and Consumer Ratings<\/h2>\n<p>In the digital age, finding the best deals on electronics can be a daunting task. With countless online platforms offering a myriad of products, consumers often find themselves overwhelmed. Cimri.com, a popular Turkish price comparison site, offers a solution by aggregating deals and consumer ratings. This article explores how to harness the power of Go and SQLite to efficiently scrape and store data from Cimri.com, enabling users to make informed purchasing decisions.<\/p>\n<h3 id=\"understanding-the-basics-of-web-scraping-fppouiMawJ\">Understanding the Basics of Web Scraping<\/h3>\n<p>Web scraping is the process of extracting data from websites. It involves fetching the HTML of a webpage and parsing it to retrieve the desired information. 
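<\/p>\n<p>As a minimal sketch of this fetch-and-parse cycle, the Go snippet below downloads a page and simply counts the element nodes in its DOM tree; the URL is a placeholder rather than a Cimri.com endpoint:<\/p>\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\">package main\r\n\r\nimport (\r\n    \"fmt\"\r\n    \"net\/http\"\r\n\r\n    \"golang.org\/x\/net\/html\"\r\n)\r\n\r\n\/\/ countElements walks the DOM tree recursively and counts element nodes.\r\nfunc countElements(n *html.Node) int {\r\n    count := 0\r\n    if n.Type == html.ElementNode {\r\n        count++\r\n    }\r\n    for c := n.FirstChild; c != nil; c = c.NextSibling {\r\n        count += countElements(c)\r\n    }\r\n    return count\r\n}\r\n\r\nfunc main() {\r\n    \/\/ Placeholder URL; substitute a page you are allowed to scrape.\r\n    resp, err := http.Get(\"https:\/\/example.com\")\r\n    if err != nil {\r\n        panic(err)\r\n    }\r\n    defer resp.Body.Close()\r\n\r\n    doc, err := html.Parse(resp.Body)\r\n    if err != nil {\r\n        panic(err)\r\n    }\r\n    fmt.Println(\"element nodes:\", countElements(doc))\r\n}\r\n<\/pre>\n<p>Counting nodes is only a stand-in for real extraction logic, but the same recursive traversal pattern is what a scraper uses to pull data out of specific elements.<\/p>\n<p>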
This technique is invaluable for collecting data from e-commerce sites like Cimri.com, where prices and product details are frequently updated.<\/p>\n<p>Using Go, a statically typed, compiled language known for its efficiency and simplicity, we can create a robust web scraper. Go&#8217;s concurrency model allows for efficient handling of multiple requests, making it ideal for scraping large datasets. Additionally, SQLite, a lightweight database engine, provides a simple yet powerful way to store and query the scraped data.<\/p>\n<h3 id=\"setting-up-your-go-environment-fppouiMawJ\">Setting Up Your Go Environment<\/h3>\n<p>Before diving into the code, ensure that you have Go installed on your machine. You can download it from the official Go website. Once installed, set up your workspace by creating a new directory for your project. This will help keep your files organized and make it easier to manage dependencies.<\/p>\n<p>Next, initialize a new Go module in your project directory. This will allow you to manage your project&#8217;s dependencies using Go modules. Run the following command in your terminal:<\/p>\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\">go mod init cimri-scraper\r\n<\/pre>\n<p>With your environment set up, you can now start writing the code to scrape Cimri.com.<\/p>\n<h3 id=\"writing-the-web-scraper-in-go-fppouiMawJ\">Writing the Web Scraper in Go<\/h3>\n<p>To begin, you&#8217;ll need to import the necessary packages. The &#8220;net\/http&#8221; package will allow you to make HTTP requests, while the &#8220;golang.org\/x\/net\/html&#8221; package provides tools for parsing HTML. 
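<\/p>\n<p>To illustrate that concurrency point, here&#8217;s a minimal sketch that fetches several pages in parallel with goroutines and a sync.WaitGroup; a mutex guards the shared results map, and the URLs are only examples:<\/p>\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\">package main\r\n\r\nimport (\r\n    \"fmt\"\r\n    \"net\/http\"\r\n    \"sync\"\r\n)\r\n\r\n\/\/ fetchStatuses fetches each URL in its own goroutine and returns\r\n\/\/ the HTTP status code per URL, or -1 on error.\r\nfunc fetchStatuses(urls []string) map[string]int {\r\n    var (\r\n        mu      sync.Mutex\r\n        wg      sync.WaitGroup\r\n        results = make(map[string]int, len(urls))\r\n    )\r\n    for _, u := range urls {\r\n        wg.Add(1)\r\n        go func(u string) {\r\n            defer wg.Done()\r\n            status := -1\r\n            if resp, err := http.Get(u); err == nil {\r\n                status = resp.StatusCode\r\n                resp.Body.Close()\r\n            }\r\n            mu.Lock()\r\n            results[u] = status\r\n            mu.Unlock()\r\n        }(u)\r\n    }\r\n    wg.Wait()\r\n    return results\r\n}\r\n\r\nfunc main() {\r\n    \/\/ Example pages; substitute pages you are allowed to scrape.\r\n    urls := []string{\"https:\/\/www.cimri.com\", \"https:\/\/example.com\"}\r\n    for u, status := range fetchStatuses(urls) {\r\n        fmt.Println(u, status)\r\n    }\r\n}\r\n<\/pre>\n<p>Because each request runs in its own goroutine, one slow page doesn&#8217;t block the others; a production scraper would also rate-limit these requests.<\/p>\n<p>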
Additionally, you&#8217;ll use the &#8220;github.com\/mattn\/go-sqlite3&#8221; package to interact with SQLite.<\/p>\n<p>Here&#8217;s a basic structure for your Go web scraper:<\/p>\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\">package main\r\n\r\nimport (\r\n    \"database\/sql\"\r\n    \"fmt\"\r\n    \"net\/http\"\r\n    \"golang.org\/x\/net\/html\"\r\n    _ \"github.com\/mattn\/go-sqlite3\"\r\n)\r\n\r\nfunc main() {\r\n    \/\/ Open the SQLite database (the connection itself is established lazily)\r\n    db, err := sql.Open(\"sqlite3\", \".\/cimri.db\")\r\n    if err != nil {\r\n        panic(err)\r\n    }\r\n    defer db.Close()\r\n\r\n    \/\/ Create table for storing product data\r\n    createTable(db)\r\n\r\n    \/\/ Fetch and parse data from Cimri.com\r\n    url := \"https:\/\/www.cimri.com\"\r\n    resp, err := http.Get(url)\r\n    if err != nil {\r\n        panic(err)\r\n    }\r\n    defer resp.Body.Close()\r\n\r\n    \/\/ Parse HTML and extract data\r\n    doc, err := html.Parse(resp.Body)\r\n    if err != nil {\r\n        panic(err)\r\n    }\r\n\r\n    \/\/ Extract and store product data\r\n    extractData(doc, db)\r\n    fmt.Println(\"Scraping complete\")\r\n}\r\n\r\nfunc createTable(db *sql.DB) {\r\n    query := `\r\n    CREATE TABLE IF NOT EXISTS products (\r\n        id INTEGER PRIMARY KEY AUTOINCREMENT,\r\n        name TEXT,\r\n        price TEXT,\r\n        rating TEXT\r\n    );`\r\n    _, err := db.Exec(query)\r\n    if err != nil {\r\n        panic(err)\r\n    }\r\n}\r\n\r\nfunc extractData(n *html.Node, db *sql.DB) {\r\n    \/\/ Implement data extraction logic here\r\n}\r\n<\/pre>\n<p>This code sets up a connection to an SQLite database and creates a table for storing product data. 
The `extractData` function will contain the logic for parsing the HTML and extracting the relevant information.<\/p>\n<h3 id=\"extracting-and-storing-data-fppouiMawJ\">Extracting and Storing Data<\/h3>\n<p>To extract data from the HTML, you&#8217;ll need to traverse the DOM tree and identify the elements containing the desired information. This can be done using a recursive function that visits each node in the tree.<\/p>\n<p>For example, if you&#8217;re interested in extracting product names, prices, and ratings, you might look for elements with specific class names or attributes. Once you&#8217;ve identified the relevant elements, you can extract their text content and store it in the database.<\/p>\n<p>Here&#8217;s an example of how you might implement the `extractData` function, along with working `extractText` and `insertData` helpers:<\/p>\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\">func extractData(n *html.Node, db *sql.DB) {\r\n    if n.Type == html.ElementNode &amp;&amp; n.Data == \"div\" {\r\n        for _, a := range n.Attr {\r\n            if a.Key == \"class\" &amp;&amp; a.Val == \"product-info\" {\r\n                \/\/ Extract product details\r\n                name := extractText(n, \"product-name\")\r\n                price := extractText(n, \"product-price\")\r\n                rating := extractText(n, \"product-rating\")\r\n\r\n                \/\/ Insert data into database\r\n                insertData(db, name, price, rating)\r\n            }\r\n        }\r\n    }\r\n\r\n    \/\/ Recursively visit child nodes\r\n    for c := n.FirstChild; c != nil; c = c.NextSibling {\r\n        extractData(c, db)\r\n    }\r\n}\r\n\r\n\/\/ extractText finds the first descendant of n whose class attribute\r\n\/\/ matches className and returns its concatenated text content.\r\nfunc extractText(n *html.Node, className string) string {\r\n    if n.Type == html.ElementNode {\r\n        for _, a := range n.Attr {\r\n            if a.Key == \"class\" &amp;&amp; a.Val == className {\r\n                return collectText(n)\r\n            }\r\n        }\r\n    }\r\n    for c := n.FirstChild; c != nil; c = c.NextSibling {\r\n        if text := extractText(c, className); text != \"\" {\r\n            return text\r\n        }\r\n    }\r\n    return \"\"\r\n}\r\n\r\n\/\/ collectText concatenates all text nodes beneath n.\r\nfunc collectText(n *html.Node) string {\r\n    if n.Type == html.TextNode {\r\n        return n.Data\r\n    }\r\n    text := \"\"\r\n    for c := n.FirstChild; c != nil; c = c.NextSibling {\r\n        text += collectText(c)\r\n    }\r\n    return text\r\n}\r\n\r\nfunc insertData(db *sql.DB, name, price, rating string) {\r\n    query := `INSERT INTO products (name, price, rating) VALUES (?, ?, ?)`\r\n    _, err := db.Exec(query, name, price, rating)\r\n    if err != nil {\r\n        panic(err)\r\n    }\r\n}\r\n<\/pre>\n<p>This code defines a recursive function that traverses the DOM tree and extracts product details based on class names; the `extractText` helper locates a descendant with the given class name and returns its text content. The extracted data is then inserted into the SQLite database.<\/p>\n<h3 id=\"analyzing-and-utilizing-the-data-fppouiMawJ\">Analyzing and Utilizing the Data<\/h3>\n<p>Once you&#8217;ve successfully scraped and stored the data, you can use SQL queries to analyze it. For example, you might want to find the top-rated products or identify the best deals based on price reductions.<\/p>\n<p>SQLite provides a powerful query language that allows you to perform complex analyses on your data. You can use SQL functions to calculate averages, find maximum or minimum values, and group data by specific criteria.<\/p>\n<p>Here&#8217;s an example of a query that retrieves the top 5 products by consumer rating:<\/p>\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\">SELECT name, price, rating\r\nFROM products\r\nORDER BY CAST(rating AS REAL) DESC\r\nLIMIT 5;\r\n<\/pre>\n<p>This query casts the stored rating text to a number so the sort is numeric rather than alphabetical, orders the products by rating in descending order, and returns the top 5 results. 
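<\/p>\n<p>Here&#8217;s a sketch of running this kind of query from Go, assuming the products table and cimri.db file created earlier; casting the stored rating text to a number keeps the sort numeric rather than alphabetical:<\/p>\n<pre class=\"EnlighterJSRAW\" data-enlighter-language=\"generic\">package main\r\n\r\nimport (\r\n    \"database\/sql\"\r\n    \"fmt\"\r\n\r\n    _ \"github.com\/mattn\/go-sqlite3\"\r\n)\r\n\r\n\/\/ topRated returns up to limit products, highest-rated first.\r\nfunc topRated(db *sql.DB, limit int) ([]string, error) {\r\n    rows, err := db.Query(\r\n        \"SELECT name, price, rating FROM products ORDER BY CAST(rating AS REAL) DESC LIMIT ?\", limit)\r\n    if err != nil {\r\n        return nil, err\r\n    }\r\n    defer rows.Close()\r\n\r\n    var out []string\r\n    for rows.Next() {\r\n        var name, price, rating string\r\n        if err := rows.Scan(&amp;name, &amp;price, &amp;rating); err != nil {\r\n            return nil, err\r\n        }\r\n        out = append(out, name+\" | \"+price+\" | \"+rating)\r\n    }\r\n    return out, rows.Err()\r\n}\r\n\r\nfunc main() {\r\n    db, err := sql.Open(\"sqlite3\", \".\/cimri.db\")\r\n    if err != nil {\r\n        panic(err)\r\n    }\r\n    defer db.Close()\r\n\r\n    products, err := topRated(db, 5)\r\n    if err != nil {\r\n        panic(err)\r\n    }\r\n    for _, p := range products {\r\n        fmt.Println(p)\r\n    }\r\n}\r\n<\/pre>\n<p>The ? placeholder passes the LIMIT value as a bound parameter rather than concatenating it into the SQL string.<\/p>\n<p>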
You can modify the query to suit your specific needs and gain valuable insights from the data you&#8217;ve collected.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Discover how to use Go &amp; SQLite to gather electronics deals from Cimri.com, tracking tech discounts, top products, and consumer ratings efficiently.<\/p>\n","protected":false},"author":419,"featured_media":3971,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"rank_math_lock_modified_date":false,"footnotes":""},"categories":[161],"tags":[],"class_list":["post-3785","post","type-post","status-publish","format-standard","has-post-thumbnail","hentry","category-forum"],"_links":{"self":[{"href":"https:\/\/rayobyte.com\/community\/wp-json\/wp\/v2\/posts\/3785","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/rayobyte.com\/community\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/rayobyte.com\/community\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/rayobyte.com\/community\/wp-json\/wp\/v2\/users\/419"}],"replies":[{"embeddable":true,"href":"https:\/\/rayobyte.com\/community\/wp-json\/wp\/v2\/comments?post=3785"}],"version-history":[{"count":2,"href":"https:\/\/rayobyte.com\/community\/wp-json\/wp\/v2\/posts\/3785\/revisions"}],"predecessor-version":[{"id":3994,"href":"https:\/\/rayobyte.com\/community\/wp-json\/wp\/v2\/posts\/3785\/revisions\/3994"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/rayobyte.com\/community\/wp-json\/wp\/v2\/media\/3971"}],"wp:attachment":[{"href":"https:\/\/rayobyte.com\/community\/wp-json\/wp\/v2\/media?parent=3785"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/rayobyte.com\/community\/wp-json\/wp\/v2\/categories?post=3785"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/rayobyte.com\/community\/wp-json\/wp\/v2\/tags?post=3785"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}