i absolutely hate how the modern web just fails to load if one has javascript turned off. i, as a user, should be able to switch off javascript and have the site work exactly as it does with javascript turned on. it’s not a hard concept, people.
but you ask candidates to explain “graceful degradation” and they’ll sit and look at you with a blank stare.
I don’t know anything about web development, but is it really fair to say it should work exactly the same with JavaScript turned off? If that were achievable, why would it be there in the first place? I assume the graceful degradation concept is that as you strip away more and more layers of additional functionality, the core functions remain, or at least the user gets some kind of explanation of why things don’t work.
People do stuff in JavaScript that you really don’t need JavaScript for. You don’t need JS to display a store listing, for instance. Or a news page, or documentation, or even a search engine.
As a web dev, I’ll say that yes, it is achievable. The problem isn’t what’s possible, but that we’ve trained new frontend devs in certain ways and given them certain tools. Those tools are being used in places they shouldn’t, and those same new frontend devs are failing to learn the fundamentals of HTTP and HTML.
React, for example, is a JavaScript framework that’s become incredibly popular in recent years. It’s meant for “single page applications”. I once made a control panel for a vacuum former with it, where you could turn on zones of heating and get the temperature updated in real time. You’re not expected to navigate away from that page while you’re using it. I think this is a good place to use React, though you could make the argument that it should be a native GUI app. (I’ll say that it isn’t that important; this thing runs fine on a Raspberry Pi 3, which is the target platform).
React is not a good option for an ecommerce site. You want to click on a product to check out its details. That means you’re going between very different views (pages) a lot. React increases complexity with no clear gain. An argument can be made for the address/payment/finalization steps. The money people like that because there’s a strong correlation between streamlining checkout and how often cash ends up in their hands.
A lot of those sites use React, anyway, for everything. Why? Because we’ve trained a bunch of new frontend devs so much on it that they have no idea how to make a site without React. This overspecialization has been detrimental.
Yeah, it’s not a hard concept, it is an impossible concept.
Love it when a page loads, and it’s just a white blank. Like, you didn’t even try. Do I want to turn JS on or close the tab? Usually, I just close the tab and move on. Nothing I need to see here.
React tutorials are like that. You create a simple HTML page with a script, and the script generates everything.
I had to do a simple webpage for an embedded webserver, and the provider of the library recommended Preact, the lightweight version of React. Having no webdev experience, I used Preact as recommended, and it was a nightmare to use and debug.
So that’s why.
You’re correct, and I’m going to explain how this happens. I’m not justifying that it happens, just explaining it.
It isn’t that no one knows what graceful degradation is anymore. It’s that they don’t try to serve every browser that’s existed since the beginning of time.
When you develop software, you have to make some choices about what clients you’re going to support, because you then need to test for all those clients to ensure you haven’t broken their experience.
With ever-increasing demands for more and more software delivery to drive ever greater business results, developers want to serve as few clients as possible. And they know exactly what clients their audience use - this is easy to see and log.
This leads to conversations like: “Can we drop browser version X? It represents 0.4% of our audience but takes the same 10% of our testing effort as the top browser.”
And of course the business heads making the demands on their time say yes, because they don’t want to slow down new projects by 10% over 0.4% of TAM. The developers are happy because it’s less work for them and fewer bizarre bugs to deal with from antiquated software.
Not one person in this picture will fight for your right to turn off JavaScript just because you have some philosophy against it. It’s really no longer the “scripting language for animations and interactivity” on top of HTML like it used to be. It’s the entire application now. 🤷‍♂️
If it helps you to blame the greedy corporate masters who want to squeeze more productivity out of their engineering group, then think that. It’s true. But it’s also true that engineers don’t want to work with yesteryear’s tech or obscure client cases, because that experience isn’t valuable for their career.
This has to be fixed, though. I don’t know how, but it’s an economic situation doing enormous damage every moment.
And most of the people it affects are, like me, in countries where real political activism is impossible.
This is the next thing that should be somehow resolved like child labor, 8-hour workdays, women’s voting rights and lead paint. Interoperability and non-adversarial standards of the global network.
What should be fixed is people. The above described logic is true, it does really happen, and behind it is the idiot desire: to get more money. Not to make a better thing, not to make someone’s life better, not to build something worthwhile - in other words, nothing that could get me out of bed in the morning. When those are the kinds of desires fueling most companies and societies, all things will go in all kinds of wrong ways.
the idiot desire to get more money
Yes, but we don’t have to make a total caricature out of it. We all need to prioritize our time. That isn’t evil, or broken, or wrong. That’s just life.
Expand this, please. I am sure I did not get you
Developers having a narrower list of browsers to support is not ONLY about greed. You say it is NOT about making something that works to improve people’s lives. And I disagree with that.
You can’t build a good piece of software and try to support every client under the sun since the beginning of time. There is a reasonable point at which to draw some lines and prioritize.
So while greed is ONE factor, you seem to be saying it’s the only factor, and that people are stupid and broken for doing this. That’s going too far.
It’s unrealistic to expect perfection. Today people want comprehensive client support. Tomorrow they will be outraged at some bug. But few realize: you may have to pick between the two. Because having zero bugs is a lot more achievable if you can focus on a small list of current browser clients. That’s just a fact. The next day they will be upset that there are ads on the site, but it may be ad revenue that pays for developers to fix all the bugs for all browser clients under the sun.
People love to rant online about how NO you should give me EVERYTHING and do it for FREE but this is childish tantruming and has no relationship to reality. Devs are not an endless resource that just gives and gives forever. They are regular people who need to go home at night like anyone else.
I am saying it is about greed because it actually is: I have yet to see a situation where the ultimate filter for supporting or dropping a client is NOT revenue from the people using that client. And here I am talking specifically about companies making money on their product, so no open-source, subsidised, or hobby projects, etc.
People love to rant online
They do; now try to catch me, specifically, doing that.
It’s worse than this even. I have an old Raspberry Pi 3B+ (1G) that I got in 2018. I hooked it up the other day to mess around with it, it’s been maybe 2 years since I did anything with it, ever since I got a Pi 4 (4G). 1 gigabyte of RAM is now insufficient to browse the web. The machine freezes when loading any type of interactive site. Web dev is now frameworks piled on frameworks with zero consideration for overhead and it’s pure shit. Outrageous.
You want to see terrible? Try looking at the network tab in inspect element.
“Modern” pages load hundreds of large assets instead of keeping it smaller and clean.
It’s also CDN on CDN; nobody hosts libraries locally anymore.
I don’t know how you’re gonna get everything to work without JavaScript. You can’t do a lot of interactivity stuff without it.
I’ve had news articles not work without javascript. (unpaywalled as well).
Do “the stuff” on the server, and only serve HTML. In my first job we built a whole webshop with very complex product configurators that would even run perfectly fine in Dillo today.
Most don’t even know
@media (prefers-color-scheme: dark/light)
, and rather cobble something together with JS that works half of the time and needs buttons to toggle.
A button to toggle is good design, but it should just default to your system preferences.
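For anyone who hasn’t seen it, the no-JS version is just a media query; a minimal sketch using CSS custom properties (the variable names and colors here are purely illustrative):

```css
/* Light theme by default */
:root {
  --bg: #ffffff;
  --fg: #1a1a1a;
}

/* Dark theme when the OS/browser asks for it — no JS, no toggle button needed */
@media (prefers-color-scheme: dark) {
  :root {
    --bg: #1a1a1a;
    --fg: #e6e6e6;
  }
}

body {
  background: var(--bg);
  color: var(--fg);
}
```

A JS toggle can still be layered on top as an override, but the default then matches the user’s system preference.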
Bookmarking this, so far I’ve cobbled my Dark/Light Mode switch together with Material-UI themes, but this seems like the cleaner way to do this that I’ve been searching for!
Also note
prefers-reduced-motion
for accessibility.
I hate “dark mode” so much
Don’t default to it as it makes the page hard to read and ugly. If you want make it optional that is fine but don’t force it.
I hate “light mode” so much
Don’t default to it as it makes the page as blinding as the mid-day sun. If you want make it optional that is fine but don’t force it.
It is substantially harder to make a modern website work without JavaScript. Not impossible, but substantially harder. HTML forms are not good at doing most things. Plus, a full page refresh on nearly any button click would be a bad experience.
I’ve spent the last year building a Lemmy and PieFed client that requires JavaScript. This dependency on JavaScript allows me to ship you 100% static files, which after being fully downloaded, have 0 dependency on a web server. Without JavaScript, my cost of running web servers would be higher, and if I stopped paying for those servers, the client would stop working immediately. Instead, I chose to depend heavily on JavaScript which allows me to ship a client that you can fully download, if you choose, and run on your own computer.
As far as privacy, when you download my Threadiverse client* and inspect network requests, you will see that most of the network requests it makes are to the Lemmy/PieFed server you select. The 2 exceptions being any images that aren’t proxied via Lemmy/PieFed, and when you login, I download a list of the latest Lemmy servers. If I relied on a web server for rendering instead of JavaScript, many more requests would be made with more opportunities to expose your IP address.
I truly don’t understand where all this hate for JavaScript comes from. Late-stage capitalism, AI, and SaaS are ruining the internet, not JavaScript. Channel your hate at big tech.
*I deliver both web and downloadable versions of my client. The benefits I mentioned require the downloaded version. But JavaScript allows me to share almost 100% code between the web and downloaded versions. In the future, better PWA support will allow me to leverage some of these benefits on web.
The matter is not JavaScript per se but the use companies and new developers make of it; if everyone used it like you do, there would probably be no problem. A gazillion dependencies and zero optimization, eating up CPU, spying on us, advertisements…
And if you try and use an alternative browser you know many websites won’t work.
Problem is so many websites are slow for no good reason.
And JS is being used to steal our info and push aggressive advertisement.
Which part is unknown to you?
I don’t understand why we are blaming the stealing info part on JavaScript and not the tech industry. Here is an article on how you can be tracked (fingerprinted) even with JavaScript disabled. As for slow websites, also blame the tech industry for prioritizing their bottom line over UX and not investing in good engineering.
Problem is so many trains are ugly for no good reason.
And steel is being used to shoot people and stab people aggressively.
Developers are still familiar with the concept; there are even ideas like server-side rendering in React to make sites more SEO-friendly.
I think the biggest issue is that there is very little business reason to support these users. Sites can be sued over a lack of accessibility, and they can lose business from bad UX, so they are going to focus on those two areas ten times out of ten before focusing on noscript and Lynx users. SEO might be a compelling reason to support it, but only companies that really have their house in order focus on those concerns.
Some surely know, they just don’t care.
Blame the ui frameworks like react for this. It’s normalized a large cross-section of devs not learning anything about how a server works. They’ve essentially grown up with a calculator without ever having to learn long division.
Not all frameworks are bad
The problem is the devs/owners not understanding basic fundamentals. They could see a major financial benefit if they make the page snappy and light but apparently no one at these companies realizes that.
the depressing reality is that 95% of business owners are comically incompetent
Fucking Reddit and their shite navigation controller that shits the bed when you zoom in.
I wrote my CV site in React and Next.js configured for SSG (Static Site Generation), which means the whole site loads perfectly without JavaScript, but if you do have JS enabled you get theme switching and a print button.
That said, requiring JS makes sense on some sites, namely those that act more like web apps that let you do stuff (like WhatsApp or Photopea). Not for articles, blogs etc. though.
requiring JS makes sense on some sites, namely those that act more like web apps that let you do stuff (like WhatsApp
I mean yes, but WhatsApp is a bad example. It could easily use no JavaScript. In the end it’s the same as Lemmy or any other forum. You could post a message and get a new page with the message. Switching chats is loading a new page. Of course JavaScript enhances the experience, makes it more fluid, etc., but messengers could work perfectly fine without JavaScript.
How would a page fetch new messages for you without JS?
You don’t. That’s the graceful degradation part. You can still read your chat history and send new messages, but receiving messages as they come in requires a page reload or enabling JS.
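There is even a crude no-JS fallback for the “receiving messages” part: the classic meta refresh, which makes the browser re-request the page on a timer. A hypothetical sketch (the interval is made up):

```html
<!-- In the chat page's <head>: reload the view every 30 seconds.
     Works with JavaScript fully disabled; clunky, but degraded, not broken. -->
<meta http-equiv="refresh" content="30">
```

It loses your scroll position and wastes bandwidth compared to events, which is exactly the “degradation” trade-off being described.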
My only issue with this ideology (the required page reload) is that this setup would essentially require a whole new processing system. Instead of updates being sent via events, they would need to be rendered and sent server-side. This also forces the server to render everything at once instead of dynamically like it currently does, which increases strain/load on the server node serving the page. It also removes the potential for service isolation between the parts of the page, meaning if one component goes down (such as chat history), the entire page handler goes down, while also worsening page response and load times. That’s the downside of those old legacy-style pages: they are a pain in the ass to maintain, run slower, and don’t have much failover ability.
It’s basically asking the provider to spend more to make the service slower, remove features from the site (both information- and functionality-wise), and have a more complex setup when scaling, all to increase compatibility for a small portion of the current machines and users out there.
This is, of course, also ignoring the increased request load, as you are now having to resend entire web pages to get data instead of just messages/updates.
The web interface can already be reloaded at any time and has to do all of this. You seem to be missing we’re talking about degradation here, remember the definition of the word, it means it isn’t as good as when JS is enabled. The point is it should still work somehow.
If it’s a standard webpage that only displays some static content, then sure.
But everything that needs to be interactive (and I’m talking about actual interactivity here, not just navigation) requires JavaScript, and it’s really not worth the effort of implementing fallbacks for everything just so you can tell your two users who actually get to appreciate that effort that the site still won’t work, because the actual functionality requires JavaScript.
It all comes down to what the customer is ready to pay for, and usually they’re not ready to pay for anything besides core functionality. Heck, I’m having a hard enough time getting budget for all the legally required accessibility. And sure, some of that no-script stuff pays into that as well, but by far not all of it.
Stuff like file uploads, validated forms and drag and drop are just not worth the effort of providing them without JS.
File uploads and forms are the easiest to do server-side.
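The basic server-side version really is just a function over the submitted fields, with any errors rendered back into the HTML response; a hypothetical Node-style sketch (the field names and rules are made up for illustration):

```javascript
// Validate a submitted form entirely server-side; returns a map of
// field name -> error message. No client-side JS required: the server
// re-renders the form with these errors filled in next to each field.
function validateSignup(fields) {
  const errors = {};
  if (!fields.username || fields.username.trim().length < 3) {
    errors.username = "Username must be at least 3 characters.";
  }
  // Deliberately simple email check; real validation is more involved.
  if (!fields.email || !/^[^@\s]+@[^@\s]+\.[^@\s]+$/.test(fields.email)) {
    errors.email = "Please enter a valid email address.";
  }
  return errors;
}
```

Whether this is *good enough* UX compared to inline client-side validation is exactly what the replies below argue about.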
Not if you want them to be at least halfway user-friendly. Form validation is terrible when done completely server-side, and several input elements like multiselect dropdowns, comboboxes, and search fields won’t work at all unless supported by client-side JavaScript. And have you ever tried to do file previews and upload progress bars purely server-side?
So I guess by file upload you mean “drop a file here and wait an uncertain amount of time for the server to handle it without any feedback whatsoever”, and by forms you mean “enter your data here, then click submit, and if we feel charitable we may reward you with a long list of errors you made. Some of which could have been avoided if you had known about them while filling in previous fields.”
So - the situation is understood, but the question arises: what does this have in common with a global hypertext system for communication?
Maybe all this functionality should be moved out into a kind of plugin, similar to how it was done with Flash and Java applets and other ancient history. Maybe sandboxed, yes.
Maybe the parts of that kind of plugin relating to DOM, to execution, to interfaces should be standardized.
Maybe such a page should look more like a LabView control model or like a Hypercard application, than what there is now.
One huge benefit would be that Google goes out of business.
The business customer or the visitor?
The visitor doesn’t exactly have a way to give feedback on whether they’d use a static page.
Stuff like file uploads, validated forms and drag and drop are just not worth the effort of providing them without JS.
Honestly, many of today’s frameworks allow you to compile the same thing for the Web, for Android (Java), for the main desktop OSes, and whatever else.
Maybe if it can’t work like a hypertext page, it shouldn’t be one.
The business customer who actually pays for the development.
Maybe if you can’t use the web without disabling JS, you shouldn’t?
Progressive Web Apps are the best tool for many jobs right now because they run just about everywhere, and as opposed to every single other technology we’ve had up until now, they have the potential to not look like complete shit!
And the whole cross-compilation that a lot of these frameworks promise is a complete pipe dream. It works only for the most basic of use cases. PWAs are the first and so far only technology I’ve used that doesn’t come with a ton of extra effort for each supported platform down the line.
The business customer who actually pays for the development.
Then it’s my duty as a responsible customer to not make it profitable for them, as much as I can.
Maybe if you can’t use the web without disabling JS, you shouldn’t?
Suppose I can use the Web with JS disabled. Just that page won’t be part of my Web.
Yes, of course when the optimization work has been done for you, it’s the easiest.
It’s an old discussion about monopolies, monocultures, standards, anti-monopoly regulations, where implicit consent is a thing and where it isn’t, and how to make free market stable.
Funny, from my standpoint, more functional JavaScript almost always feels like service degradation - as in, the more I block, the better and the faster the website runs.