Latest Significant World Earthquake:
2026/02/22 at 16:57
Location: SABAH, MALAYSIA
Depth: 619 km
Strength: 7.1
#bot #scrape https://earthquakes.bgs.ac.uk/earthquakes/recent_world_events.html
AI firms are quietly scraping the web and millions of books without permission to train massive LLMs. The hidden data harvest fuels ever‑larger parameter counts and token pools, raising legal and ethical questions for the open‑source community. How should we protect authors and keep generative AI transparent? Read the full story. #AI #LLM #scrape #generativeAI
🔗 https://aidailypost.com/news/ai-firms-scrape-web-millions-books-without-permission-fuel-llms
🔍 / #software / #browser / #extension
#Scrape any website with one click
Australia’s Antarctic icebreaker RSV Nuyina ‘makes contact with ocean floor’ near remote Heard Island
Australia’s icebreaker has been involved in an incident in which the ship’s hull “made contact with the ocean…
#NewsBeep #News #Headlines #aad #antarctic #AU #Australia #flood #heardisland #icebreaker #nuyina #ocean #program #runaground #scrape #seafloor #tas #tasmania
https://www.newsbeep.com/184116/
#Landlords Demand Tenants’ #Workplace #Logins to #Scrape Their #Paystubs
Landlords are using a service that logs into a potential renter’s #employer systems and scrapes their paystubs and other information en masse, potentially in violation of U.S. #hacking laws, according to screenshots of the tool shared with 404 Media.
#privacy #security #payroll #tenants
https://www.404media.co/landlords-demand-tenants-workplace-logins-to-scrape-their-paystubs/
pg_fetch_cycle is an experimental PostgreSQL extension built on top of pg_net that orchestrates multi-step HTTP request sequences. It enables things like paginated API scraping, request chaining, and stateful HTTP workflows.
I know I am going to need this for something, someday.
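The multi-step pattern pg_fetch_cycle orchestrates inside PostgreSQL can be sketched in plain Python (a minimal illustration of cursor-based pagination with request chaining, not the extension's actual API; `fetch_page` and the cursor shape are assumptions):

```python
def fetch_all_pages(fetch_page, max_pages=10):
    """Drive a cursor-style pagination chain.

    fetch_page(cursor) -> (items, next_cursor); cursor=None requests the
    first page, and next_cursor=None signals the last one. Each response's
    cursor is carried into the next request -- the "stateful HTTP workflow"
    such an orchestrator automates.
    """
    items, cursor = [], None
    for _ in range(max_pages):  # hard cap guards against endless chains
        page_items, cursor = fetch_page(cursor)
        items.extend(page_items)
        if cursor is None:
            break
    return items

# Usage with a fake three-page API (no network needed):
pages = {None: ([1, 2], "a"), "a": ([3, 4], "b"), "b": ([5], None)}
print(fetch_all_pages(lambda c: pages[c]))  # -> [1, 2, 3, 4, 5]
```

In a real scrape, `fetch_page` would wrap an HTTP call and parse the cursor out of each response body; keeping the loop separate from the transport is what makes the chain easy to test.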
https://www.androidpolice.com/google-sheets-how-to-scrape-data/
It bothers me when people and organizations back the "stealing content to train 'AI' is fair use" argument. To me, it seems pretty clearly *not* fair use. But these orgs frequently back their positions with something along the lines of "But AI can be a positive, ethical force, we have to be able to create ethical AI".
I don't have a problem acknowledging that ethical uses of "AI" are possible.
I have a problem with "AI" backers not acknowledging that the only people funding "AI" have no interest in ethical uses of it.
#AI #ethics #ethical #FairUse #training #LLM #BigTech #scrape #slop #theft #copyright #funding