@b0rk I hit a lot of brick walls when trying to teach myself all sorts of computing topics, and the one I think of most often is the way "Kids and the Commodore 128" presented FOR loops.
They said it would count some variable from one number to another, and the example they gave of what it was good for was slowing a program down: stick do-nothing FOR loops in the middle of a GOTO loop. As a result, my grade-school brain refused to accept the real utility of the thing; they'd poisoned my conception of the construct with a terrible framing.
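The pattern they showed was roughly this (a sketch from memory in Commodore-style BASIC; the line numbers and loop bound are my own invention):

```basic
100 PRINT "HELLO"
110 FOR D = 1 TO 500 : NEXT D : REM do-nothing loop, exists only to burn time
120 GOTO 100
```

Presented that way, the loop variable D never does anything, which is exactly the framing that made the construct look pointless.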
I think the biggest question I had when encountering new topics was always rooted in "But *why* do we need this thing? What is it actually used for?" I needed examples of sequence processing that weren't just toys.
The O'Reilly Python book was kind of a dud, because it taught all the language features through the lens of an FTP file-uploader program that, well, didn't do anything a good FTP client didn't already handle. It was extremely difficult to care about the unfolding narrative of the single large example program the book was built around.
By contrast, the O'Reilly Learning Perl book was phenomenal at teaching the language through little ten-line examples that did amazing and useful things. I stopped bouncing off Unix concepts when I read that book, because it taught me the ways in which Perl smashed a bunch of Unix conventions into one big interpreter (which I first used on DOS, for funny historical reasons). I later took what I learned into a professional Linux career.
I think Learning Perl and the Unix-Haters Handbook were probably the two most important books for actually teaching me what Unix was about. The latter gave me more context about the way Unix was designed than any document trying to advertise it to me!