i spent too much time on this

@vildis +9001%

Why they don't go with tar or pax or plain bzip2 / xz is beyond me.

#tech #shitpost #tar #pax #bzip2 #bunzip2 #xz #compression

@kkarhan @vildis It's legacy stuff from the usenet days. If one file is broken, you can just download the missing blocks for it from other usenet providers, so you don't have to redownload the entire thing because one piece is missing/broken on your provider. Most tools these days make the rar files essentially transparent anyway, so you never really need to worry about it...?
@jay @kkarhan @vildis you just reminded me of PAR files
@kajer @kkarhan @vildis Yep, still use them to this day (entirely transparently to me).
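
For anyone who hasn't run into them: PAR files carry extra recovery blocks alongside a release, so a handful of damaged blocks can be rebuilt locally instead of redownloaded. A rough sketch of the workflow, assuming the stock `par2` command-line tool (file names hypothetical):

```
# create recovery data with 10% redundancy for a split rar set
par2 create -r10 release.par2 release.part*.rar

# after downloading: report which files/blocks are damaged or missing
par2 verify release.par2

# rebuild the damaged volumes from the recovery blocks
par2 repair release.par2
```
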
@jay @kkarhan@infosec.space @vildis right but torrents literally do the block integrity check and redownload as part of the protocol and infohash, with even better granularity. and if you want to seed you have to keep the split rars *and* the unpacked media, so it takes up double the disk space.
@gsuberland @vildis What you're seeing there are people downloading from usenet and just uploading to torrent sites then. There's very little point in uploading split rars to torrent sites, as you point out (though there are video apps out there that can transparently play media directly from split rar files without needing to extract them).
@jay @vildis yup, exactly. I know all the scene history, it's just very annoying when people continue to do this. most of it comes down to the idea that it allows you to verify the split rar hashes against the release group's hashes to prove it wasn't tampered with during the reupload, but nobody ever does that anyway, especially on TV/movies, and it's a huge headache.
@jay @vildis there's also a bunch of nonsense people get up to like someone uploading to a private group as a split rar, then another group getting hold of it and doing a public rerelease, except of course they want to bundle their group's NFO with it. but they can't do that without changing the rar hashes and ruining the Purity™. so what do they do? they pack the split rars into a *new* set of split rars, just to add the nfo/diz file. and then someone uploads that verbatim as a torrent.
@jay @vildis so you get utterly ludicrous situations like a torrent of a split rar of a split rar of an iso file and you need triple the disk space free for unpacking if you want to seed.

@gsuberland @jay @vildis now I'm wondering if this layering of junk might, by accident, be helping in one way:

if there's a dozen unrelated torrent uploads of the same set of split rars, and none of them are completely seeded, you may be able to mix and match the rar files from all the partial torrent downloads?

Caution, tangent

One thing I personally benefitted from a lot when it comes to rars, back in the floppy disk era, was the ability to create rars with a customizable 5/10/20% data redundancy so the archive could self-heal a few bad sectors away (sketched below).

It's a feature I wish more backup/archive tools still had today.

But that's not at all helpful for any kind of internet downloads nowadays (and hasn't been for a long time).
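
That self-heal feature is RAR's recovery record, and it's still there; a sketch with the classic `rar` CLI (names hypothetical):

```
# add a 10% recovery record so the archive can repair itself
rar a -rr10% backup.rar documents/

# later: attempt self-repair of a bit-rotted archive
rar r backup.rar
```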

@gsuberland @jay @vildis same for disc image files, which take up even more space. i just don’t get why people do that

@jay @kkarhan @vildis not usenet, but private ftp servers running a stack like glftpd and pzs-ng, which check files against an sfv checksum on upload

i’d still agree it’s “legacy stuff” though; these standards were originally set in the late 1990s, when iso releases started becoming a thing. before that, everything was in zips as a holdover from the BBS days, with BBS software that would pkunzip -t on upload.
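
For context, the sfv side of that stack is just a list of CRC32 checksums; a sketch assuming the common `cksfv` tool (names hypothetical):

```
# generate an .sfv (CRC32 checksums) for a release
cksfv release.part*.rar > release.sfv

# verify the files against it, the way pzs-ng does on upload
cksfv -f release.sfv
```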

@jay @kkarhan @vildis
my real world use for split rar archives: one Rxx per floppy disk! (i am NOT kidding! Did this for years, since i purchased my first cd burner late, around 1998, and didn't have broadband before 2000.)
#retrocomputing
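
A sketch of that with the classic `rar` CLI (names hypothetical; -vn picks the old .r00/.r01 extension naming):

```
# split into 1440 kB volumes: backup.rar, backup.r00, backup.r01, ...
rar a -v1440k -vn backup bigfile.iso

# extraction reads the volumes back in sequence
unrar x backup.rar
```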

@adorfer @jay @vildis I KNOW THAT!

  • I literally did that myself 30+ years ago when I had a handed-down #Windows95 T H I C C "pizzabox" which didn't have a CD-Burner, only a CD-ROM and even if I had CD-RWs back in 1999, they would've been more expensive than a 10-pack of 3,5" 1440kB FDDs and it would've been wasteful for stuff in the 2-10 MB region.

Granted #RAR is outdated, and even if we assume we still have to split files to systematically abuse the free tiers of file hosters (which I disapprove of, because it's not only a bit antisocial but also susceptible to #LinkRot), there are now better options like #7zip that are even easier to use than tar & bzip2 & xz
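
A sketch of the comparison (names hypothetical): 7z does volumes and compression in one step, while the tar route needs a pipeline plus split.

```
# one tool: compression plus 100 MB volumes
7z a -v100m archive.7z big_directory/

# the tar-based equivalent
tar -c big_directory/ | xz | split -b 100m - archive.tar.xz.
```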

@jay @kkarhan @vildis It's not so much "legacy" as it is that Usenet continues to be where a lot of pirated stuff originates.

Segmenting a file lets you check individual parts of a large download for errors while other parts are still downloading, and then redownload any corrupt segments. Bittorrent does this inherently, but other download methods don't.

Assuming the people making the things know what they're doing, they'd turn off compression entirely and just segment.
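
Which rar supports directly; a sketch (names hypothetical):

```
# -m0 = store (no compression), -v50m = 50 MB segments
rar a -m0 -v50m movie movie.mkv
```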

@StarkRG @jay @vildis personally I'd rather still compress stuff for speed reasons, as bandwidth is the limiting factor, not computational power or storage.

  • That being said, that too can be done by setting the compression factor to 0.

https://infosec.space/@kkarhan/114789676621551304

I don't host, seed, download or distribute anything copyrighted, but like what @vxunderground and @VXShare do for #Antivirus, I've seen enough groups also do some basic #encryption to avoid any #ContentID-style matching and thus automated #DMCA takedowns...

@kkarhan @jay @vildis

Video is already compressed. Using lossless compression on already lossily compressed data won't get you a smaller file. It's also why transmitting compressed video over a compressed connection ends up seeming slower than transmitting non-compressed data: the actual data rate remains the same, but you're not getting any benefit out of the compression.

I agree that segmentation is unnecessary for bittorrent, but many torrents are just straight copies of usenet posts.

@kkarhan @StarkRG @jay @vildis @vxunderground The simple encryption applied to the torrents here is to prevent anti-malware from triggering and deleting the malware a researcher is trying to download. An added benefit is the compression of the zip file, because for some, bandwidth and storage *is* an issue.
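
The long-standing convention is a fixed, publicly known password, i.e. obfuscation against scanners rather than secrecy; a sketch with Info-ZIP (names hypothetical):

```
# encrypt with the widely used convention password "infected"
zip -P infected -r samples.zip samples/
```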

@VXShare @StarkRG @jay @vildis @vxunderground OFC, even if their corporate firewall doesn't blocklist your domain, most #MITM-based "#NetworkSecurity" solutions and "#EndpointProtection" will checksum files and instantly yeet them into the shadow realm.

  • Researchers should OFC only run said malware for research purposes and on #airgapped, sanctioned systems, but they need to get their hands on it in the first place.

And let's be honest: like with chemistry and medicine, one wants to have a supplier that isn't shady af but actually transparent.

  • The "alternative" would be to go into some "dark corners" and risk getting something else entirely.
@vildis @kkarhan Using file sharing hosts for movies is illegal only for the uploader. BitTorrent and IPFS are illegal for the downloader as well, because of how those things work.
(IANAL, but that’s how it works in Germany)

@kkarhan @vildis folks were rar'ing the matrix divx on windows 95 machines back then. 👴🏻

projects, software, and standards around 7z and xz weren't around just yet, timeline-wise. ✨

@nicksilkey @vildis yes, for ancient stuff that counts, but not new releases!
@kkarhan @vildis couldn't tell ya how the scene all works now. but i'm sure it's fascinating to study and understand, as it always was. 💃
@vildis Okay, .tar.001-.tar.nnn it is then...
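
Which really is just GNU split plus cat (names hypothetical):

```
# produce movie.tar.001, movie.tar.002, ... in 1 GiB pieces
split -b 1G --numeric-suffixes=1 -a 3 movie.tar movie.tar.

# joining is plain concatenation, no special tool needed
cat movie.tar.* > movie.tar
```
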
@vildis movies should only be distributed as uncompressed jp2k
@vildis I can get behind this campaign
@vildis Don’t forget the RARs with password protection, like who was that for
@MisterMoo @vildis for people who had all the RAR part files, probably the uploader only
@vildis I'm old enough to have split a file across multiple floppy disks. 😅
@TranshumanBlues @vildis I also remember rar's spiritual predecessor, arj.

@vildis Parts were for slow, unreliable connections back in the day: you downloaded them one by one, and if one failed you just redid that piece, not the whole huge file. When you had them all, there was an integrity check on combining that wouldn't be there otherwise, and it would tell you which part was corrupted so you just got that one again.

It wasn't for compression purposes, it was for delivery purposes. No one compresses them these days.

@rejzor @vildis yessir. windows didn't have /usr/bin/split to work with back in the day. so we made it work with what we had, including standards to optimize for outcomes.

haha. syndicacy, baby! #thescene #warez950 #efnet4eva ✌️💙

@rejzor
Honorable mention:
7PLUS by DG1BBQ (around 1993)
file converter/encoder for error correction via store&forward networks, automatic generation of resend-requests for partial files/diffs.
https://github.com/hb9xar/7plus

@vildis
#amateurradio

@vildis

But it's 100% spot on and totally accurate 🙂👍

@vildis I had spats with some people over this. I asked them to block rar torrents; they said it's good for saving space! But it saves almost no space when used with video content. I gave up and installed/set up unpackerr

@vildis real use-case for splitting: transferring large files from A to B either over a very crappy network link or via physical media much smaller than the original file. Think diskettes and sub-1Mbit networks over sub-par phone networks.

The problem with pirates is they have a way of doing things without understanding why or where that way came from.

TL;DR - Pirates are ignorant at best and would do well to learn their own history.

@vildis It is lovely. Thank you.

@vildis ... compression? rars? ...

do you mean like a house cat?
here, I have your compressed rars.

@vildis One reason for splitting is that JDownloader doesn’t support Mega.nz files above 5 GB. But yes, RAR is the wrong format.

@vildis@infosec A couple of years back I got a work-related data dump from a colleague who had sent a multi-part rar across multiple emails.

Once combined, there was a single zip inside, which contained a single data file.

Progress.

@vildis this is only needed for file hosters / the direct download scene
@alina @vildis Even for that, wouldn't 7z still be better?
in my experience RAR just plain sucks ass
Like yeah I know winrar demented brains, but it's been a long time, can't we use better shit
@MarkAssPandi @vildis i have never seen a 7z split archive
@alina @vildis I am almost 100% sure it's supported tho
Like I never saw it in the wild either but I know it has splitting
@alina @vildis Fun fact: I remember this mostly because I saw one person who claimed 7z is for splitting big files
@vildis tyvm for doing the lords work. A++++++
@vildis
> stop doing rars
Rawr!

@vildis @gnomon

The only notable reason to split files was because of underlying filesystem limitations. Sure, split that huge tarball across a dozen 3½" floppies. Or that FAT filesystem with the 2GB file limit.

But modern disk-sizes and file-systems? No good reason to split.

@gumnos @vildis @gnomon actually there is one: sharing files when there's a size limit, like telegram with 2gb per file
@vildis Wanted to split big files into chunks anyway for a laugh, we had a tool for that, it was called BIT TORRENT
@vildis @benbe aren't torrents just 100s of thousands of tiny rar-like chunks in a trench coat? 😉
@platymew Nope, torrents are lists of checksums for each block; no compression.
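
The per-piece idea can be sketched in the shell with GNU split's --filter: one checksum per fixed-size block, the way a v1 .torrent stores one SHA-1 per piece (file name hypothetical):

```
# print one checksum per 256 KiB piece, without writing chunk files
split -b 262144 --filter='sha1sum' movie.mkv
```
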
@benbe meh, kinda questionable how losslessly compressible already-lossily-compressed data is. 😏
@vildis You can use tar to split or join files in an archive. Its compression is optional, and it's not limited to just one compression algorithm.
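
A sketch of both points with GNU tar (volume names hypothetical):

```
# multi-volume archive; -L is the volume size in 1024-byte units (1440 = one floppy)
tar -cM -L 1440 -f vol1.tar -f vol2.tar -f vol3.tar somedir/

# compression is pluggable: gzip (-z), bzip2 (-j), xz (-J), zstd (--zstd)
tar -cJf archive.tar.xz somedir/
```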