I built a web-scraper API that is 6-7x more efficient than existing ones

https://scrapewithruno.com/

#HackerNews #webscraper #API #efficiency #techinnovation #programming #tools

Runo - Web Scraping API | Any URL to Typed JSON

Runo extracts structured, typed JSON from any URL using AI. Define your schema, call the API, get clean data. Handles JS rendering, bot bypass, and batch crawling.
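The post doesn't show Runo's actual API surface, so here is only a sketch of the general pattern such schema-based extraction services use: POST a target URL plus a desired output schema, get typed JSON back. The endpoint, field names, and auth header below are all hypothetical placeholders, not Runo's real API.

```python
import json
import urllib.request

# Placeholder endpoint -- NOT Runo's real URL; consult their docs.
API_URL = "https://api.example.com/extract"

def build_request(url: str, schema: dict, api_key: str) -> urllib.request.Request:
    """Package the target URL and desired output schema into one POST request."""
    payload = json.dumps({"url": url, "schema": schema}).encode()
    return urllib.request.Request(
        API_URL,
        data=payload,
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
        method="POST",
    )

# Example schema: ask for a product name (string) and a price (number).
schema = {"name": "string", "price": "number"}
req = build_request("https://example.com/product/42", schema, "YOUR_KEY")
print(req.get_method(), req.full_url)
# Sending it would just be: urllib.request.urlopen(req)
```

The appeal of this pattern is that the schema travels with the request, so the API can return clean, typed JSON regardless of how the page is rendered.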

Maxun v0.0.32 is out with AI-native features and real-time recording. Open source, self-hostable, no-code web data extraction. Integrates with LlamaIndex, LangChain, the OpenAI SDK, and many other AI frameworks via its SDK. AI Extract mode navigates automatically, no URL required. Accurate real-time recording of actions: typing, clicking, scrolling, navigating. A good fit for building workflows and intelligent agents. #Maxun #WebScraper #AIIntegration #OpenSource #DataExtraction #TríchXuấtDữLiệu #AI #MãN

Cory – Blocking Countries because of scrapers

What the title says: Cory is blocking entire countries due to misbehaving scrapers. We do a bit of this at work, blocking countries whose traffic floods our sites. There is very little reason for anyone to visit a city's website unless they live in that city, maybe the city next door, or maybe elsewhere in the city's own country.

99% of the time, when a site gets DDoSed, the traffic is coming from outside the country; the leading sources are India, China, and North Korea. Sure, a single person or a family could be researching a city, but that doesn't explain the traffic floods.

Many of our customers use Cloudflare, so we just block the offending countries at the Cloudflare level and call it a day. I go back after a few weeks and remove the block, since some valid traffic from those countries is to be expected.
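Country-level blocks like this can be scripted against Cloudflare's IP Access Rules endpoint rather than clicked through the dashboard. A minimal sketch, assuming that endpoint: the zone ID and token are placeholders, and country-scoped rules may be gated to certain plans, so check Cloudflare's current API docs before relying on this.

```python
import json
import urllib.request

# Placeholders -- substitute your own zone ID and an API token with
# firewall edit permissions.
ZONE_ID = "YOUR_ZONE_ID"
API_TOKEN = "YOUR_API_TOKEN"

def country_block_request(country_code: str, note: str) -> urllib.request.Request:
    """Build (but do not send) the POST that creates a country-wide block rule."""
    payload = json.dumps({
        "mode": "block",
        "configuration": {"target": "country", "value": country_code},
        "notes": note,  # a note makes the rule easy to find and remove later
    }).encode()
    return urllib.request.Request(
        f"https://api.cloudflare.com/client/v4/zones/{ZONE_ID}"
        "/firewall/access_rules/rules",
        data=payload,
        headers={
            "Authorization": f"Bearer {API_TOKEN}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = country_block_request("CN", "temp block: scraper flood, revisit in a few weeks")
print(req.full_url)
# To actually apply the rule: urllib.request.urlopen(req)
```

Tagging each rule with a note is what makes the "go back after a few weeks and remove the block" step practical, since you can list and delete rules by note.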

I had to take a similar stance on my own site, blocking a batch of offending scrapers and bots by country. It sucks to stop regular people from visiting my site, but I've already dealt with one monthly bill that came to $5k when it should have been $50, and I don't need another.

#webCrawler #webScraper
Blocking entire countries because of scrapers

Cory Dransfeldt
https://curtismchale.ca/2026/01/14/cory-blocking-countries-because-of-scrapers/
#LinksOfInterest #WebCrawler #WebScraper
Cory – Blocking Countries because of scrapers – Curtis McHale

Bright Data's new API lets developers weave AI/ML models, LLMs, and generative AI directly into web-scraping workflows while handling anti-bot protection for them. JavaScript-ready, open-source friendly, and built for seamless anti-bot bypass. Dive into the benchmarks and see how it powers smarter data pipelines. #BrightDataAPI #AIintegration #AntiBot #WebScraper

🔗 https://aidailypost.com/news/bright-data-api-delivers-seamless-aiml-integration-antibot-protection

🔍 / #software / #automation / #scraping

#WebScraper - The #1 web scraping extension

The most popular web scraping extension. Start scraping in minutes. Automate your tasks with our Cloud Scraper. No software to download, no coding needed.

🐱🔗 https://laravista.altervista.org/CatLink/links/454

#catlink #softwareautomation


🤖 Getting the list of all images on an HTML page with PHP
Building a web scraper in PHP to get the complete list of all images from a URL...

👉 https://www.selectallfromdual.com/blog/1639

#html #php #webscraper
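The linked tutorial does this in PHP; the same idea, sketched in Python using only the standard library's HTMLParser (the sample HTML and base URL below are made up for illustration):

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class ImageCollector(HTMLParser):
    """Collect the src of every <img> tag, resolved against a base URL."""

    def __init__(self, base_url: str):
        super().__init__()
        self.base_url = base_url
        self.images: list[str] = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            src = dict(attrs).get("src")
            if src:
                # Resolve relative paths like /logo.png against the page URL.
                self.images.append(urljoin(self.base_url, src))

page = """
<html><body>
  <img src="/logo.png">
  <p>No image here</p>
  <img src="https://cdn.example.com/photo.jpg" alt="photo">
</body></html>
"""
collector = ImageCollector("https://example.com/blog/post")
collector.feed(page)
print(collector.images)
# → ['https://example.com/logo.png', 'https://cdn.example.com/photo.jpg']
```

In a real scraper you would fetch the page first (e.g. with urllib) and feed the response body to the collector; the parsing step stays the same.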



8 web scraping & crawling tools with n8n integration (free workflow template download)

Today we look at how you can do web scraping and crawling. We go through 8 different tools and connect them directly to n8n, so you can process the extracted data further in a workflow.

https://www.youtube.com/watch?v=LP571gnIg7A

#n8n #ki #automatisierung #webscraping #webcrawler #webscraper
