"where computation is almost free, memory size is almost unlimited (although programmers' ingenuity in creating bloated software apparently knows no bounds)"
10 comments
0 u/varialus 12 Feb 2020 07:57
Computers aren't so very fast, and there are lots of layers. If programmers don't pay attention, computers can easily grind to a crawl. It's one thing to have some lazy programming on top and have it still feel plenty fast, but if all the layers underneath were programmed lazily, it wouldn't be pretty.
0 u/Gumbatron 12 Feb 2020 09:37
Not sure what his programming language is like, but this guy sounds like an irritating sperg. Everybody else is wrong and I'm right, hur da hur!
If his programming language was so good people would be using it.
0 u/Muh-Shugana 12 Feb 2020 16:41
Not saying he's right, but we all damn well know that things that are good aren't used these days.
0 u/MXIII 13 Feb 2020 08:17
Like what? You are being vague. I am not knocking you.
0 u/feral-toes 12 Feb 2020 19:26
The logic of your comment proves too much. Consider a good language that has only a few users because it is brand new. One argues, "If this programming language was good people would be using it," and decides against adopting it. By that logic, not even a good new language could ever gain users.
0 u/BakedMofoBread 12 Feb 2020 17:40
It takes an order of magnitude less mental energy to define a problem than to solve it.
0 u/Deceneu 16 Feb 2020 15:25
"Solving said problem" is mostly an exercise in "defining the problem".
0 u/feral-toes 12 Feb 2020 19:23
That seems like the worst page to start at. You should have linked to the introduction or the index.
You've linked to the Grumpy Old Man's Appendix. That is cruel :-(
0 u/Deceneu 16 Feb 2020 15:27
I liked it. It felt grumpy, it piqued my interest, currently reading the whole thing.
0 u/J_Darnley 13 Feb 2020 23:32
I love the contradiction in one sentence. Other than that there are a few good points and a bad one, including his criticizing the use of machine-sized integers.
That one is particularly egregious. I do not want to do my work on abstract number types that would presumably support rationals and arbitrary precision. I want to work on specifically sized data because it is audio or video, and I want that work to be fast since I am handling more than gigabit data rates.
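The point about fixed-size integers can be made concrete. A minimal sketch, assuming 16-bit PCM audio samples (the `add_samples_i16` helper is hypothetical; `ctypes` is used here only to get int16 wrapping semantics):

```python
# Hypothetical illustration: mixing two 16-bit PCM audio samples.
# Sample data on disk is fixed-width, so results must stay inside
# the int16 range; the arithmetic has to wrap (or clip) there.
import ctypes

def add_samples_i16(a: int, b: int) -> int:
    # Fold the unbounded Python sum back into the int16 range.
    return ctypes.c_int16(a + b).value

# Python's arbitrary-precision ints silently leave the sample range:
assert 30000 + 30000 == 60000          # not a representable int16 sample
# Fixed-width arithmetic stays in range, matching the data format:
assert add_samples_i16(30000, 30000) == -5536
```

An abstract rational or bignum type would give a mathematically nicer answer, but it no longer matches the bytes in the file, and it costs far more per sample at gigabit rates.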
Too many functions and types is right. I'm looking at you, Python. I was recently forced to use you for something that was proving a bit too difficult in plain shell.
I left this tab open far too long without posting this.