Where the fuck did we go so wrong with modern software? (Tantrum)

Smaug's Smokey Hole

no corona
kiwifarms.net
Software gets bloated because hardware is too good, so there's no one around to crack the whip. Ten years ago you could buy a laptop with a Core i5 and 8GB of RAM for under $1000. Today you can buy a Core i5 with 8GB of RAM for under $1000. A lot of computers still ship with 8GB of memory, which is fine for a normal user; it's faster than it used to be, but the amount is the same. Chrome and webapps have been rising like dough in that memory space for years now. It's absurd that the hottest new game rendering photorealistic SSS titties at 120fps is fine with 8GB of memory, but using an online spreadsheet in Chrome might require the user to upgrade to avoid an aneurysm.
 

Goldman

PROTECKING THE LYFE CYCLE
kiwifarms.net
While I completely agree with your sentiment that corporations and monocultures would love to eradicate programming independence, what is happening right now is absolutely the fault of programmers (hobbyist or otherwise) trading control and independence for immediacy. It has never been easier to get the information and build the skills required to develop a project of any level at any depth. The fact that FOSS and software in general is in a state of decline is because programmers are actively refusing to care about the art of programming. I'm not talking about the scripting dilettantes who exclusively use glue languages and argue about the legitimacy of dynamic madness writ large -- this is a problem that extends all the way down to systems programmers. Look at all the new languages: monocultures with a single compiler and a common mantra: pointers are dangerous, you can't outsmart the compiler, and don't reinvent the wheel (NIH is the devil!), because you can't make a better implementation than X since X was made by Y authority figures. It's absolute horseshit, but the seduction of immediacy coupled with laziness gets you things like Electron ubiquity or std::function usage.

In the future it's highly possible that the monocultures / corps will start to actively hide information, but for now the onus is entirely on the programmers to stop making deals with the devil for some fast action and get back to the "tedium of implementation" (viz., the art of programming). C/C++ are the only ideological bastions left wherein the programmer is trusted and free to go as low as they wish, and they aren't monocultures (if there are any other languages, please let me know), so I'll use them as the example. Systems documentation for any major OS is a few search engine queries away; compilers are well documented and often OPEN SOURCE; tools like Compiler Explorer let you easily disassemble source code on any relevant compiler and any sane hardware target; Intel supplies the entire x86-64 instruction set reference for free (I don't know about AMD, but I'm sure they do something similar); there are many open source graphics libraries available if you don't want to write your own; there are countless books on math for applications / game programming (usually with actual C++ examples) that teach you the core set of LinAlg / Calc you'll need to navigate most situations (they'll even skip over those "nasty" proofs in case you were having trouble concentrating!); you can always start with a bit more abstraction via some framework like SFML, and then you can even make quick and dirty prototypes or tools with Dear ImGui at a speed that would make Electron blush (and they'll look a hell of a lot better than base Qt). The amount that a single, untrained person can achieve is insane, provided they care enough to spend a bit of time developing skills. Learning the fundamentals might be a bit tedious or boring, but they're not hard. You don't have to have IEEE 754 memorized to get a general sense of floating point representation and its various challenges and pitfalls.

The knowledge is not only out there to support any project idea at any scale, it's practically begging people to read it, and yet no one seems interested; the only thing left they could do is pay you to care. Safety nets are everywhere with places like Stack Overflow. Any person with a pulse can download an entire, active professional game engine like Unreal and dissect the thing at their leisure to find out all sorts of tricks. Unreal has fantastic internal documentation in most areas and is a wonderful educational resource even if you never use it; you can also do the same with graphics libraries like OpenGL or even framework wrappers like SFML. There has never been a time when so much information has been so easily accessible to any programmer with the drive to take advantage of it. You can even tear apart clang and LLVM and see how modern compilers are made. The mystery is no longer hidden behind closed source, proprietary fog. Talks from professionals sharing valuable developments in all fields, from conventions like CppCon, are free and readily available. Now think of how easy it is to find other programmers with similar interests; it should be common for small groups of all skill levels to form and create elegant, fantastic, interesting, or strange FOSS competition everywhere. We should be in the middle of a software renaissance, yet instead we're mired in the sludge of glue languages and kernels that exist primarily to fulfill the insatiable malloc demands of browser tabs. At some point in the distant future, it really might be the fault of some shadow corporation; for now, though, the only reason programmers "can't" develop software independently is because they don't have the willpower to sit down and read some useful text in their ubiquitous browser.
That's very true, perhaps these big corps are just waiting with open arms.
Software gets bloated because hardware is too good so there's no one around to crack the whip.
[...]
My CPU fan starts to sound like it's taking off when I'm in Discord voice chat. Is it actually doing something, or is it just that badly made?
 

Idiotron

The last sane person on Earth
kiwifarms.net
I've noticed that, for the past decade or so, new software has been more oriented towards user friendliness than functionality.
I'm fine with making the learning curve less steep but don't sacrifice the quality.
At the very least, give me a choice between a convenient/buggy piece of software or a demanding/functional one.

Another thing is that software companies have adopted the DLC model from video games.
You pay full price for the software and then you pay over and over and over again to get features which should have been there from the start.
You can also choose the subscription model and just pay endlessly for renting a product instead of 1 payment to own the damn thing (I'm looking at you, Adobe, I'll be using cracked versions of your shit because I refuse to give you $600 per year).
 

Harvey Danger

getting tired of this whole internet thing
kiwifarms.net
I blame the internet and internet-centered architectures for making this necessary. If your local code has a bug, it might crash and make you look like a retard. But if that code's facing the internet, that bug is going to turn into an RCE exploit and immediately get automated by script kiddies and spammed out to every IP address in existence. That naturally leads to a certain degree of risk-aversion: better to glue in some battle-tested library than to implement something yourself. Plus, if someone does find an exploit, you won't be the only one affected so it's not your fault!
Automated code metrics are a secondary culprit: replacing your own code with a library call is always rewarded.
This reminds me of the analysis a few of us did on the disastrous Iowa caucus app back in February (God that seems like so long ago).

How did the state party get brand new modern polling software deployed to the entire state for a mere $70k? By hiring two grad students out of Poland. How did they manage to "code" a super important, widely deployed app in less than 3 months? By cobbling together Auth0 for logins, React for UI, and Google Cloud for data management.

And when it broke down on election night, the IT world gave a collective shocked_Pikachu_face.jpg and carried on without condemning modern coding practices. The lesson they "learned" was that caucuses themselves need to be eliminated, not fly-by-night dev shops who glue currently fashionable libraries/services together.

I've noticed that, for the past decade or so, new software has been more oriented towards user friendliness than functionality.
I'm fine with making the learning curve less steep but don't sacrifice the quality.
At the very least, give me a choice between a convenient/buggy piece of software or a demanding/functional one.
It's economies of scale. It's more profitable to sell something at a lower price to millions of casual users than to sell something at a higher price to a smaller set of power users. And by "more profitable", I don't mean the profits are linearly higher; I mean they have the potential to go exponential.
 

Ledj

kiwifarms.net
I blame the internet and internet-centered architectures for making this necessary. If your local code has a bug, it might crash and make you look like a retard. But if that code's facing the internet, that bug is going to turn into an RCE exploit and immediately get automated by script kiddies and spammed out to every IP address in existence. That naturally leads to a certain degree of risk-aversion: better to glue in some battle-tested library than to implement something yourself.
Yeah, that's definitely a problem and the internet has very much complicated things. The standards/protocols being so bloated and needlessly complex very well might be some kind of nefarious obfuscation instead of just the usual incompetence, too. I hadn't really considered that angle.

The NIH paranoia is good in very specific areas like cryptography, but it's getting applied everywhere, even where code correctness / security isn't even a concern; performance gets cited all the time by standard zealots who claim <algorithm> is as fast as it can get or that <vector> is sufficient and performant for all purposes. Ask a specific implementation question in the Rust community and they'll tell you not to bother and hand you a crates.io link instead. It relies way too much on the zero-overhead abstraction principle, which I appreciate in theory but dislike in practice, because its entire premise is that you can't write a better version by hand; the problem is that general tools meant for all purposes can only be generally optimized. I know far more about my specific context than the compiler or any standard library does (also, what if I just don't want the dependencies?), so while there are plenty of scenarios where something like <algorithm> is the right choice, there are equally many scenarios where writing one's own implementation is the right thing to do. (Writing one's own implementation can also be a fun and rewarding task in its own right, which shouldn't be discounted.)

Discord is essentially a web browser without the ability to surf the wider web.
Discord is (surprise!) made via Electron.
 

Razel

Give me the dust of my father.
kiwifarms.net
I'm reaching here, but the Ribbon is really nice to use; still, there's nothing new you can do from Office 2007 onwards that you couldn't do before.
Oh yeah, the Ribbon is generally a pretty nice advancement, and I'd even allow that it was worthy of a proper version increment.

The fact that we're getting to where hobbyist devs can't create software independently is by design. Most people who use computers will be shuttled to consumption boxes, while those who show coding aptitude will be pushed to use libraries and code from those corporations, almost like slightly more advanced consumers. Nothing will be done without the consent of the corporations. Unfortunately, coding will become more and more abstracted, and most devs will forget about code that isn't super high level or that isn't just reusing existing stuff, which will only create more dependencies.
I think that Ledj's post holds some water as well, but I do feel like corporate interests do work to advance the rate of decay.

While I completely agree with your sentiment that corporations and monocultures would love to eradicate programming independence, what is happening right now is absolutely the fault of programmers (hobbyist or otherwise) trading control and independence for immediacy. It has never been easier to get the information and build the skills required to develop a project of any level at any depth. The fact that FOSS and software in general is in a state of decline is because programmers are actively refusing to care about the art of programming.
[...]
In the future it's highly possible that the monocultures / corps will start to actively hide information, but for now the onus is entirely on the programmers to stop making deals with the devil for some fast action and get back to the "tedium of implementation" (viz., the art of programming). C/C++ are the only ideological bastions left wherein the programmer is trusted and free to go as low as they wish, and aren't monocultures (if there are any other languages, please let me know) so I'll use them as the example.
[...]
At some point in the distant future, it really might be the fault of some shadow corporation; for now, though, the only reason programmers "can't" develop software independently is because they don't have the willpower to sit down and read some useful text in their ubiquitous browser.
100% right, and indeed I'd say this is a problem with the tech sector as a whole, on both the consumption and production ends. My biggest concern is the way in which we so readily drop existing FOSS solutions for the corporate FOSS alternatives for a slight advantage in convenience. When's the last time anything meaningful came out of any of the Firefox forks? God, I wish there was a solid fork of pre-Quantum Firefox that had the human resources behind it necessary to do away with its inherent bugs/flaws/holes.
 

Unassuming Local Guy

Friendly and affectionate
kiwifarms.net
Software gets bloated because hardware is too good so there's no one around to crack the whip.
I think this is 90% of the issue.

Think about programming 30 years ago. The entire point was to squeeze as much power out of your limited resources as you could. Computers couldn't multitask for shit and had to be tricked into doing anything beyond math. If you wanted to make anything more complex than a calculator you had to minimize its footprint as much as humanly possible.

Modern computers can handle anything you throw at them, so there's no real incentive to optimize. Why spend a week cutting your average CPU usage from 20% to 15%? Who's going to notice? Of course, when every program is made with that philosophy in mind, things snowball very quickly considering how many simultaneous programs run on the average computer, but if you're a lazy asshole then what do you care? Nobody's going to notice until you're paid and gone.

I like the money analogy for hardware power. Someone who makes minimum wage is (in theory) going to live frugally, wasting nothing. Someone who makes 5 million a year is going to be pissing away so much money it'd make the average person sick. In the end, the latter still has far more money, but the amount they waste is tens or hundreds of times what the former even earns. They just don't care.

Also, most companies would rather give you a month's pay for garbage code than two months' pay for good code, because they're stupid and don't know any better.
 

JonesMcCannister

Who was dragged down by the stone
kiwifarms.net
We now have morons who call themselves "programmers" that churn out piles of shit. They have no understanding how their program interacts with the CPU and RAM, and they most certainly don't understand why this knowledge is important in the first place.
I'm working on becoming a developer (it's been a hobby for a long time) and I'm interested in what you said. Can you link any resources to learn more about how programs interact with the hardware?
 

Jack O'Neill

Fuck
kiwifarms.net
I'm working on becoming a developer (it's been a hobby for a long time) and I'm interested in what you said. Can you link any resources to learn more about how programs interact with the hardware?
You may want to check out Jonathan Blow's game development series, or Molly Rocket's "Intro to C on Windows". They go into great detail on this. The videos are several hours long, but they're worth it.
 

spiritofamermaid

2 Commission Spots Left
True & Honest Fan
kiwifarms.net
lolwut - they seriously did that? Can you switch it back to 'Classic Mode' or something?

That's surely gonna fuck with so many designers' workflows, having to get used to a new UI, especially on an app that's as complex and multilayered with years of cruft like PS.

Now would be a good time to get your head around GIMP or whatever, if you've got to learn a whole other UI/workflow anyway.
Everything is still in the same place; it just has different icons (material design ahoy!) and is darker overall.

And no, people asked if they could switch back or have an add-on to use the old overlay and they said lolno. =(
 

Smaug's Smokey Hole

no corona
kiwifarms.net
Probably a hell of a lot easier to do on say, a C64 or Amiga than it would be on a modern PC just using the bare metal.
It's easier on PC: use SDL, create a software surface/framebuffer, and then use only the CPU to plot/draw the pixels directly to it. A seemingly trivial task for modern hardware that will make the RAM catch fire trying to read/write tens of megabytes per second. After that it's time to optimize it in increments and change it into something that produces the same result but isn't performance poison; this part gets closer to the old C64 and Amiga, because memory is still memory and CPUs are low functioning autists. It's a small (compiles in a second) and self-contained (all the code can be kept in the main file without becoming unwieldy) way to learn/see what different components are allergic to.

(using a Windows GDI surface would be better because it's way more fucked from the outset, it's just so much faster to get a window up with SDL)
 

Mr. Duck

kiwifarms.net
The rise of better internet connections and computers getting faster made developers lazy, and then the smartphones came around, forcing the industry to go full HTML5 and everything went to shit.

lolwut - they seriously did that? Can you switch it back to 'Classic Mode' or something?

That's surely gonna fuck with so many designers' workflows, having to get used to a new UI, especially on an app that's as complex and multilayered with years of cruft like PS.

Now would be a good time to get your head around GIMP or whatever, if you've got to learn a whole other UI/workflow anyway.
Just keep an older pirated copy around. I've been using CS6 for years now and I have no intention of getting the newer version, since I have no use for whatever new features it has.

I still have a working Photoshop license on my Mac from the college I was attending, although I suspect that's going away soon since I'm not going to enroll again until there's a vaccine. While I do use Gimphoto (GIMP with a Photoshop-like interface) on my HP, there are some things that are much easier to do in Photoshop.
That sounds interesting. I've avoided GIMP like the plague because I hate the interface, so it's nice to have an alternative for when, eventually, older copies of Photoshop stop working on Windows.
 