What are the craziest misconceptions you’ve heard about programming from people not familiar with it?

https://lemmy.ml/post/12422626

What are the craziest misconceptions you’ve heard about programming from people not familiar with it? - Lemmy

As someone who spends time programming, I of course find myself in conversations with people who aren’t as familiar with it. It doesn’t happen all the time, but these discussions can lead to people coming up with some pretty wild misconceptions about what programming is and what programmers do. - I’m sure many of you have had similar experiences. So, I thought it would be interesting to ask.

The worst and most common misconception is that I can fix their Windows issues from a vague description they give me at a party.
Isn’t the solution to send them this link? ;-)
distrowatch.com/dwres.php?resource=major
DistroWatch.com: Put the fun back into computing. Use Linux, BSD.

News and feature lists of Linux and BSD distributions.

Won’t solve their problem, but they won’t be your friend anymore :)
This does solve the problem for you though
You Don't Win Friends with Salad Linux
Windows bad. Linux good. No need for nuance, just follow the hivemind.
Do you have a few minutes to talk about our Lord and Saviour, Linus Torvalds?

Don’t you mean Linux Torvalds?

that hurt to type

Oh you have an issue on Linux? Just try a different distro
How to download and install Linux

Download and install Linux in this tutorial that covers how to choose a distribution, how to use the install command with Windows Subsystem for Linux, create a bootable USB for Bare-metal, or set up a Virtual Machine.

Or some stupid Facebook “””issue”””
At least that’s an easy one, you just convince them to delete their account. \s
Lol! My mum still asks both me and my husband (“techy” jobs, according to her) to solve all her problems with computers/printers/the internet at large/any app that doesn’t work… the list is endless. I take it as a statement of how proud she is of me that she would still ask us first, even though we haven’t succeeded in fixing a single issue since the time the problem was an old cartridge in the printer, some 5-6 years ago.
My answer: "I don't play Windows".
My favorite is "and there was some kind of error message." There was? What did it say? Did it occur to you that an error message might help someone trying to diagnose your error?

What did it say?

I’ve met people who legitimately did not understand this question.
“What do you mean, what did it say? I clicked on it but it still didn’t work.”

Then you set up an appointment to remote in, ask them to show you what they tried to do, and when the error message appears, they instantly close it and say “See, it still doesn’t work. What do we even pay you for?”
I’ve had remote sessions where this was repeated multiple times, even after telling them specifically not to close the message. It’s an instinctive reflex.

Or it won't happen when you're watching, because then they're thinking about what they're doing and they don't make the same unconscious mistake they did that brought up the error message. Then they get mad that "it never happens when you're around. Why do you have to see the problem anyway? I described it to you."
When that happens, I’m happy. Cause there is no error when the task is done right.
I mail them a quick step-by-step manual with what they just did while I watched.
When the error happens the next time I can tell them to RTFM and get back to me if that doesn’t solve the issue.
The notion that creating a half-decent application is quick and easy enough that I would be willing to transform their idea into reality for free.

That’s absolutely true. What’s hard and what’s easy in programming is so completely foreign to non-programmers.

Wait, you can guess my password in under a week but you can’t figure out how to pack a knapsack?
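The quip above contrasts brute-forcing a weak password (cheap) with the 0/1 knapsack problem, which is NP-hard in general, although small instances fall to the classic dynamic-programming approach. A minimal sketch; the item weights and values are made up for illustration:

```python
# 0/1 knapsack via dynamic programming -- pseudo-polynomial in the capacity,
# which is why it stays tractable only for small integer capacities.
def knapsack(items, capacity):
    """items: list of (weight, value) pairs; returns the best achievable value."""
    best = [0] * (capacity + 1)
    for weight, value in items:
        # Iterate capacities downwards so each item is used at most once.
        for c in range(capacity, weight - 1, -1):
            best[c] = max(best[c], best[c - weight] + value)
    return best[capacity]

print(knapsack([(3, 4), (4, 5), (2, 3)], 6))  # -> 8 (items of weight 4 and 2)
```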

I’m pretty sure government software always blows because they think software can be written according to a fixed schedule and budget.

It’s tempting to think it’s like building a house, and if you have the blueprints & wood, it’ll just be fast and easy. Everything will go on schedule

But no, in software, the “wood” is always shape shifting, the land you’re building on is shape shifting, some dude in Romania is tryna break in, and the blueprints forgot that you also need plumbing and electric lines

Well, that’s probably true for the most part, but in reality it comes down to the lowest bidder 9 times out of 10. Unrealistic budgets and unrealistic time frames, with the cheapest labor they can find, account for a large share of government-funded projects over the years.

One of the most common problems with government or other big-organisation software is that it doesn’t scale, either “not well” or “not at all”.

Some guy hacks up a demo that looks nice and seems to do what customer wants, but then it turns out a) that it only allows for (number of open ports on one machine) users at the same time, and b) it only works if everything runs on one machine. Or worse, one core.

It’s tempting to think it’s like building a house, and if you have the blueprints & wood, it’ll just be fast and easy. Everything will go on schedule

it never goes according to schedule even if there are blueprints & wood

I have a hypothesis that a factor is that government needs to work for everyone.

A private company can be like “we only really support chrome”, but even people running ie6 at a tiny resolution need to renew their license.

I believe this is usually covered by the fact that you can do just about anything you need to do by mail instead. I once ran into a government site that only worked in Edge.
Building a house (or any construction project) is notoriously hard to keep on schedule and on budget too.
What’s worse is them insisting that you build it in Rust and Mongodb only.
I once had this and ended up paying for the meeting room cuz he was broke.
I once had a friend who told me that he finds it interesting that I think and write in 1s and 0s.
Every Hollywood programmer
Confuse him with “I used to do that but I’m nonbinary now”
Oh, my gender, sexuality, and base are also quantum
That might be true of VHDL / Verilog programmers I guess.
The time it takes to make something. We had a vulnerability in a JavaScript library in an old app that I do minimal support on. I said that it only uses like 3 or 4 libraries, so depending on what it is, the whole frontend may need to be re-written. IT: “Ok, well, we have to get that expensed.” Sure bro, let me just bill the client that is paying for it and error support 20k for new dev time. Nah, the fix is gonna have to be a workaround on your end; we do not have the bandwidth and they don’t have the capital.

Nah, the fix is gonna have to be a workaround

Ah, yes. The “do nothing but cross your fingers and pray it doesn’t bite you in the ass” workaround.

No, one part of the fix would be an access policy limited to their network or via VPN. Security is part IT and part dev
A lot of people completely overrate the amount of math required. Like, it’s probably been a week since I used an arithmetic operator.

Sometimes when people see me struggle with a bit of mental maths or use a calculator for something that is usually easy to do mentally, they remark “aren’t you a programmer?”

I always respond with “I tell computers how to do maths, I don’t do the maths”

Which leads to the other old saying, “computers do what you tell them to do, not what you want them to do”.

As long as you don’t let it turn around and let the computer dictate how you think.

I think it was Dijkstra who complained in one of his essays about naming uni departments “Computer Science” rather than “Comput*ing* Science”. He said it’s a symptom of a dangerous slope where we build our work as programmers around specific computer features or even specific computers, instead of using them as tools that let our minds ask and verify more and more interesting questions.

The scholastic discipline deserves that kind of nuance and Dijkstra was one of the greatest.

The practical discipline requires you to build your work around specific computers. Much of the hard-earned domain knowledge I’ve accumulated as a staff software engineer would be useless if I changed the specific computer it’s built around: Android OS. An Android phone has very specific APIs, code patterns, and requirements. Being ARM, even its underlying architecture is fundamentally different from the majority of computers (for now; we’ll see how much the M1-style ARM architecture becomes the standard for anyone other than Mac).

If you took a web dev with 10YOE and dropped them into my Android code base and said “ok, write” they should get the structure and basics but I would expect them to make mistakes common to a beginner in Android, just as if I was stuck in a web dev environment and told to write I would make mistakes common to a junior web dev.

It’s all very well and good to learn the core of CS: the structures used and why they work. Classic algorithms and when they’re appropriate. Big O and algorithmic complexity.

But work in the practical field will always require domain knowledge around specific computer features or even specific computers.

I think Dijkstra’s point was specifically about uni programs. A CS curriculum is supposed to make you train your mind for the theory of computation not for using specific computers (or specific programming languages).

Later during your career you will of course inevitably get bogged down into specific platforms, as you’ve rightly noted. And that’s normal because CS needs practical applications, we can’t all do research and “pure” science.

He has a rant where he calls software engineers basically idiots who don’t know what they’re doing, saying the need for unit tests is proof of failure. The rest of the rant is just as nonsensical, basically waving away all problems as trivial exercises left to the mentally challenged practitioner.

I have not read anything from/about him besides this piece, but he reeks of that all too common, insufferable academic condescension.

He does have a point about the theoretical aspect being often overlooked, but I generally don’t think his opinion on education is worth more than anyone else’s.

Article in question: www.cs.utexas.edu/~EWD/…/EWD1036.html

E.W. Dijkstra Archive: On the cruelty of really teaching computing science (EWD 1036)

Sounds about right for an academic computer scientist, they are usually terrible software engineers.

At least that’s what I saw from the terrible coding practices my brother learned during his CS degree (and what I’ve seen from basically every other recent CS grad entering the workforce who didn’t do extensive side projects and self-teaching), which I had to spend years helping him unlearn afterwards when we worked together on a startup idea, writing lots of code.

We must do different sorts of programming…

There’s a wide variety of types of programming. It’s nice that the core concepts can carry across between the disparate branches.

If I’m doing a particular custom view I’ll end up using sin cos tan for some basic trig but that’s about as complex as any mobile CRUD app gets.

I’m sure there are some math heavy mobile apps but they’re the exception that proves the rule.

You should probably use matrices rather than trig for view transformations. (If your platform supports it and has a decent set of matrix helper functions.) It’ll be easier to code and more performant in most cases.
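For readers wondering what “matrices rather than trig” means in practice: a rotation is one matrix built from a single sin/cos pair, then applied to every point of the path. Android’s `android.graphics.Matrix` does this natively; here is a platform-neutral Python sketch where the function names are my own:

```python
import math

def rotation_matrix(degrees):
    """2x2 rotation matrix, built once from a single sin/cos pair."""
    r = math.radians(degrees)
    c, s = math.cos(r), math.sin(r)
    return [[c, -s], [s, c]]

def transform(m, points):
    # One matrix, many points: the whole path rotates in a single pass.
    return [(m[0][0] * x + m[0][1] * y, m[1][0] * x + m[1][1] * y)
            for x, y in points]

corners = [(1, 0), (0, 1), (-1, 0), (0, -1)]
rotated = transform(rotation_matrix(90), corners)  # (1, 0) -> (0, 1), etc.
```

The win is that composed transformations (rotate, then scale, then translate) collapse into one matrix multiplication up front instead of per-point trig calls.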

I mean I’m not sure how to use matrices to draw the path of 5 out of 6 sides of a hexagon given a specific center point but there are some surprisingly basic shapes that don’t exist in Android view libraries.

I’ll also note that this was years ago before android had all this nice composable view architecture.

Hah, yeah a hexagon is a weird case. In my experience, devs talking about “math in a custom view” has always meant simply “I want to render some arbitrary stuff in its own coordinate system.” Sorry my assumption was too far. 😉

Yeah it was a weird ask to be fair.

Thankfully android lets you calculate those views separately from the draw calls so all that math was contained to measurement calls rather than calculated on draw.
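For what it’s worth, the “5 of 6 sides of a hexagon” measurement math discussed above only needs a few lines of trig. A hypothetical sketch (the function name and `keep` parameter are mine, not the actual Android view code):

```python
import math

def polygon_path(cx, cy, radius, sides=6, keep=5):
    """Vertices covering the first `keep` of `sides` edges of a regular
    polygon centred on (cx, cy) -- e.g. 5 of a hexagon's 6 sides."""
    return [(cx + radius * math.cos(2 * math.pi * i / sides),
             cy + radius * math.sin(2 * math.pi * i / sides))
            for i in range(keep + 1)]  # keep edges need keep + 1 vertices

outline = polygon_path(0.0, 0.0, 10.0)  # 6 vertices, 5 edges of a hexagon
```

As noted in the comment, this can all happen during measurement, with the draw pass just connecting the precomputed vertices.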

At the same time, I find it amazing how many programmers never make the cognitive jump from the “playing with legos” mental model to “software is math”.

They’re both useful, but to never understand the latter is a bit worrying. It’s not about using math, it’s about thinking about code and data in terms of mapping arbitrary data domains. It’s a much more powerful abstraction than the legos and enables you to do a lot more with it.

For anybody who finds themselves in this situation I recommend an absolute classic: Defmacro’s “The nature of Lisp”. You don’t have to make it through the whole thing and you don’t have to know Lisp, hopefully it will click before the end.

defmacro - The Nature of Lisp

An article about Lisp programming language.

the “playing with legos” mental model

??

Functions/classes/variables are bricks; you stack those bricks together and you’re a programmer.

I just hired a team to work on a bunch of Power platform stuff, and this “low/no-code” SaaS platform paradigm has made the mentality almost literal.

I think I misunderstood lemmyvore a bit, reading some criticism into the Lego metaphor that might not be there.

To me, "playing with bricks" is exactly how I want a lot of my coding to look. It means you can design and implement the bricks, connectors and overall architecture. If running with the metaphor, that ain't bad, in a world full of random bullshit cobbled together with broken bricks, chewing gum and exposed electrical wire.

If the whole set is wonky, or people start eating the bricks instead, I suppose there's bigger worries.

(Definitely agree on "low code" being one of those worries, though - turns into "please, Jesus Christ, just let me write the actual code instead" remarkably often. I'm a BizTalk survivor and I'm not even sure that was the worst.)

My take was that they’re talking more about a script kiddy mindset?

I love designing good software architecture, and like you said, my object diagrams should be simple and clear to implement, and work as long as they’re implemented correctly.

But you still need knowledge of what’s going on inside those objects to design the architecture in the first place. Each of those bricks is custom made by us to suit the needs of the current project, and the way they come together needs to make sense mathematically to avoid performance pitfalls.

I think you are being irresponsible towards your future if you are a gainfully employed self-taught programmer and don’t invest in formal education. If you say ‘I don’t have time!’, well, consider this: even night classes in junior colleges teach you stuff you don’t know. Go to them, get your associate’s. I am in the process of getting into a contract where I do some thankless jobs for someone I know, and he in exchange pays me to go to college. I am 31 – as I said in the other thread, THERE IS NOTHING WRONG WITH BEING A LATE-COLLEGER!

I have been to college; I studied 3 subjects for a total of 9 semesters and have no degree to show for any of them :( I quit English lit, French lit, and “Programming”, all after 3 semesters. But when I was studying French lit, there was a guy in our class who was SIXTY-FIVE YEARS OLD! He wanted to learn French to open a commerce consulting office focusing on import/export from France.

What I wanted to do was to ‘write’ (keep in mind, ‘write’, not ‘draw’) bande dessinée! But now that I am older and hopefully wiser, I have a set goal in mind. I am going to go to this ‘boutique’ college near our home to study Electronics Engineering, and when push comes to shove and China makes its move, start a chipset engineering firm with a production wing.

Just like how electronics is math plus physics, programming is the virtual aspect of it: it’s ‘applied math’. I understand enough discrete math, because I studied enough of it both in college and in high school (since I was in the math/physics elective), that I have managed to understand some very rough and old papers.

You can always self-study if you haven’t got the time. Here’s a book which is kind of a meme, but it’s still very good: fuuu.be/…/Introduction-To-The-Theory-Of-Computati…

This is the 2nd edition though, 3rd is out — I think 4th is coming. The best goddamn book, regardless of its meme status.

Read that knowing nothing of lisp before and nothing clicked tbh.

When talking about tools that simplify writing boilerplate, it only makes sense to me to call them code generators if they generate code for another language. Within a single language, a tool that simplifies complex tasks is just a library, or could be implemented as a library. I don’t see the point about programmers not utilizing ‘code generation’ because it requires external tools. They say that if such tools existed in the language natively:

we could save tremendous amounts of time by creating simple bits of code that do mundane code generation for us!

If code is to be reused you can just put it in a function, and doing that doesn’t take more effort than putting it in a code generation thingy. They preach how the XML script (and Lisp, I guess) lets you introduce new operators and change the syntax tree to make things easier, but don’t acknowledge that functions, operator overloading, etc. accomplish the same thing, only with different syntax, and then go on to say this:

We can add packages, classes, methods, but we cannot extend Java to make addition of new operators possible. Yet we can do it to our heart’s content in XML - its syntax tree isn’t restricted by anything except our interpreter!

What difference does it make whether the syntax tree changes depending on your code or the call stack changes depending on your code? Of course, if you define an operator (apparently also called a function in Lisp) somewhere else, it’ll look better than doing each step one by one as in the Java example. Treating functions as keywords feels like a completely arbitrary decision. If anything, the parentheses seem to be the operator that shapes the syntax tree, while the functions just determine what happens at each step like any other function. And even going by their definition, I’d rather have a syntax that does a limited number of things in a more visually distinct way than a syntax that does limitless things all in the same monotonous way.

Lisp comes with a very compact set of built in functions - the necessary minimum. The rest of the language is implemented as a standard library in Lisp itself.

Isn’t that how every programming language works? It feels unfair to raise this as an advantage against a markup language.

Data being code and code being data sounded like it was leading to something interesting, until it was revealed that functions are a separate type and that you need to mark non-function lists with an operator for them not to get interpreted as functions. Apart from the visual similarity in how it’s written, due to the syntax limitations of the language, the data doesn’t seem any more like code than evaluating strings in Python. If the data is valid code it’ll work; otherwise it won’t.

The only compelling part was where the same compiler for the code is used to parse incoming data and perform operations on it, but even that doesn’t feel like a game changer unless you’re forbidden from using libraries for parsing.

Finally, I’m not sure how the article relates to code being math either. It just felt like inventing new words for existing things and insisting that they’re different. Or maybe I just didn’t get it at all. Sorry if this was uncalled for; it’s just that I had expected more after being promised enlightenment by the article.
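For what it’s worth, the “data is code” idea under debate can be sketched outside Lisp entirely: represent a program as nested lists, and “quoting” is simply not handing the list to the evaluator. A toy Python sketch, where the operator table and function names are invented for illustration:

```python
import math

# Expressions are plain lists: the same structure can be inspected,
# rewritten, or serialised as data -- or handed to evaluate() as code.
OPS = {'+': lambda *args: sum(args),
       '*': lambda *args: math.prod(args)}

def evaluate(expr):
    if not isinstance(expr, list):   # atoms evaluate to themselves
        return expr
    op, *args = expr
    return OPS[op](*[evaluate(a) for a in args])

program = ['+', 1, ['*', 2, 3]]  # as data: just a list you can transform
result = evaluate(program)       # as code: 1 + (2 * 3)
```

A program could rewrite `program` before evaluating it, which is the macro-flavoured trick the article is gesturing at, minus Lisp’s uniform syntax.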

This is a person that appears to actually think XML is great, so I wouldn’t expect them to have valid opinions on anything really lol
On the other hand, in certain applications you can replace a significant amount of programming ability with a good understanding of vector maths.
Negl, I absolutely did this when I was first getting into it, especially with langs where you actually have to import something to access “higher-level” math functions. All of my review materials have me making arithmetic programs, but none of it goes over, like, 9th-grade math, tops. (Unless you’re fucking with satellites or lab data, but… I don’t do that.)