Modern web bloat means some pages load 21MB of data - entry-level phones can't run some simple web pages, and some sites are harder to render than PUBG

Tecno Spark 8C, a low-spec phone that can hit 40 FPS in Battle Royale PUBG but lags below 1 FPS on social media. (Image credit: Tecno)

Earlier this month, Danluu.com released an exhaustive, 23-page analysis/op-ed/manifesto on the current state of unoptimized web page and web app performance, finding that merely loading a web page can bog down an entry-level device that runs the popular game PUBG at 40 fps. The Wix webpage, for instance, requires loading 21MB of data for a single page, while the better-known Patreon and Threads load 13MB of data per page. This can result in load times of up to 33 seconds or, in some cases, in the page failing to load at all.
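
For anyone curious where those megabytes come from on a page they're visiting, the browser's Resource Timing API exposes per-resource transfer sizes. The snippet below is only a minimal sketch (not the report's methodology) that tallies the bytes a page has transferred when pasted into the DevTools console; transferSize reads 0 for cross-origin resources that don't send a Timing-Allow-Origin header, so the total is a lower bound.

```typescript
// Minimal sketch: sum the transfer sizes the browser recorded for this page.
// Run from the DevTools console once the page has settled. Cross-origin
// resources without Timing-Allow-Origin report transferSize = 0, so this
// figure understates the true page weight.
function totalPageWeightMB(): number {
  const entries = [
    ...performance.getEntriesByType("navigation"),
    ...performance.getEntriesByType("resource"),
  ] as PerformanceResourceTiming[];
  const bytes = entries.reduce((sum, entry) => sum + entry.transferSize, 0);
  return bytes / (1024 * 1024);
}

console.log(`~${totalPageWeightMB().toFixed(1)} MB transferred so far`);
```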

The main measurements are in the table below, which records the Largest Contentful Paint (LCP) time across several websites and low-spec devices. As the full name implies, LCP is roughly the time between a user opening the page and the device rendering the page's primary content. Alongside the load times, the bandwidth demands of each site are shown in the left-hand columns. The test hardware ranges from the lowest-end Itel P32 and the entry-level Tecno Spark 8C up to far more powerful systems like the Apple M3 Max, M1 Pro, and M3.
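
LCP itself can be watched from inside a page with a PerformanceObserver. The sketch below shows the standard browser API pattern; the report's figures presumably come from controlled, repeated lab runs on each device, so treat this as illustrative rather than a reproduction of the testing.

```typescript
// Minimal sketch: log Largest Contentful Paint candidates as they occur.
// Candidates fire as progressively larger elements render; the last candidate
// before user input is the LCP value reporting tools use.
const lcpObserver = new PerformanceObserver((entryList) => {
  for (const entry of entryList.getEntries()) {
    // startTime is milliseconds from navigation start to when this candidate rendered.
    console.log(`LCP candidate at ${(entry.startTime / 1000).toFixed(2)} s`);
  }
});

// buffered: true replays candidates that fired before the observer was attached.
lcpObserver.observe({ type: "largest-contentful-paint", buffered: true });
```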

As the testing above shows, some of the most brutally intensive websites include the likes of...Quora, and basically every major social media platform. Newer content production platforms like Squarespace and newer forum platforms like Discourse also perform significantly worse than their older counterparts, often to the point of unusability on some devices.

The Tecno Spark 8C, one of the prominent entry-level phones common in emerging markets, is a particularly compelling test device that stands out. The phone is actually quite impressive in some ways, including its ability to run PlayerUnknown's Battlegrounds Mobile at 40 FPS, yet the same device can't even load Quora and suffers nigh-unusable lag when scrolling social media sites.

That example is probably the best summation of the overall point: modern web and app design increasingly trends toward an unrealistic assumption of ever-increasing bandwidth and processing power. Quora is a website where people answer questions; there is absolutely no reason it, or any of these sites, should be harder to run than a battle royale game.

The full piece goes into much more detail on these points and includes several citations and quotes from web developers. For example, founder and former CEO of Discourse Jeff Atwood was quoted as saying Qualcomm is "terrible at their jobs. I hope they go out of business"...because Qualcomm CPUs were 15% behind Apple's. That feels like an extreme statement about a manufacturer that provides the majority of smartphone SoCs available on the market. You can read the full report here for more details about the tests and resulting performance measurements. 

  • Neilbob
    Back in my day I had a brief period (very late 90s/early 2000s) of being a website designer, and the overriding mantra was always to keep everything as small and compact as it could possibly go because the assumption was that the majority of people would be using 56k (which usually equated to 3.3k/sec) dial-up connections, with a few lucky sods having 128k ISDN or maybe even 256k ADSL! Any element over 100k in size (maybe 250k for the very biggest important stuff) was out of the question. Perhaps it's time for companies to return to that mindset, even in this time of fibre and 100s of Mbits, and design efficiently for the lowest common denominator.

    We also had to jump through a million hoops to ensure sites worked properly with Internet Explorer 6, but that was a whole other issue. Fond memories they are not.

    Those times are long past. I wouldn't have a clue now.
    Reply
  • Alvar "Miles" Udell
    Much of what makes most sites slow and resource-intensive these days is the trackers and garbage that load alongside the page you want. For example, this page's tab runs at 67.3MB of RAM with only tomshardware, Twitter, and futurecdn enabled; with everything allowed it uses 118MB of RAM, and none of those extra things enable additional pictures, tables, charts, or anything else, they just eat resources. According to Solarwinds Pingdom's ( https://tools.pingdom.com/ ) report for the same page, with everything enabled there are 89(!) requests, with all but 35 coming from domains other than futurecdn and tomshardware.

    And it's not just a problem on low-end hardware or in developing countries; it affects anyone who has used dial-up this millennium or has to use sub-5Mbps satellite or cell internet these days, as many people do because they don't have access to 5G or strong, unthrottled 4G.
    Reply
  • JamesLahey
    Neilbob said:
    Back in my day I had a brief period (very late 90s/early 2000s) of being a website designer, and the overriding mantra was always to keep everything as small and compact as it could possibly go because the assumption was that the majority of people would be using 56k (which usually equated to 3.3k/sec) dial-up connections, with a few lucky sods having 128k ISDN or maybe even 256k ADSL! Any element over 100k in size (maybe 250k for the very biggest important stuff) was out of the question. Perhaps it's time for companies to return to that mindset, even in this time of fibre and 100s of Mbits, and design efficiently for the lowest common denominator.

    We also had to jump through a million hoops to ensure sites worked properly with Internet Explorer 6, but that was a whole other issue. Fond memories they are not.

    Those times are long past. I wouldn't have a clue now.
    When I think of it I can hear the familiar beep-boop, pause, screech!

    I wonder how much of the modern web's horsepower is directed squarely at cross-site tracking and all the greasy undercarriages of web design…
    Reply
  • jeremyj_83
    Neilbob said:
    Back in my day I had a brief period (very late 90s/early 2000s) of being a website designer, and the overriding mantra was always to keep everything as small and compact as it could possibly go because the assumption was that the majority of people would be using 56k (which usually equated to 3.3k/sec) dial-up connections, with a few lucky sods having 128k ISDN or maybe even 256k ADSL! Any element over 100k in size (maybe 250k for the very biggest important stuff) was out of the question. Perhaps it's time for companies to return to that mindset, even in this time of fibre and 100s of Mbits, and design efficiently for the lowest common denominator.

    We also had to jump through a million hoops to ensure sites worked properly with Internet Explorer 6, but that was a whole other issue. Fond memories they are not.

    Those times are long past. I wouldn't have a clue now.
    I'm sure a lot of the bloat is from ads on the webpage. Doesn't matter how efficiently you make the page if most of the size is from ads.
    Reply
  • Sippincider
    Our corporate IT had the big idea to block Google on all their devices (security!). Then quickly relented after discovering how many pages, including a few of their own, use Google fonts...

    It's very annoying when what should be a straightforward page needs to connect to half of the Internet to render. Why does a page that has nothing to do with Facebook sit and hang waiting for Facebook?
    Reply
  • bit_user
    Yup. I keep thinking about this, when I read people say you don't need more than (insert some ancient, low-spec CPU) for mere web browsing. The web is increasing in complexity, just like any other software. It evolves with the client platforms that use it, and is optimized only to the point where it's usable on the mainstream machines most web developers are probably using.

    Sadly, so much of the heft of the modern web is from spyware. Video ads can also bog down a low-spec device terribly. Firefox has an option to disable autoplaying videos, but it often doesn't seem to work.
    Reply
  • mitch074
    Tracking is one thing. Loading 5MB of JS libraries just to run an animation is unfortunately standard nowadays, as most WordPress "developers" ignore resource optimisation.
    Reply
  • bit_user
    founder and former CEO of Discourse Jeff Atwood was quoted as saying Qualcomm is "terrible at their jobs. I hope they go out of business"...because Qualcomm CPUs were 15% behind Apple's.
    Good point. I hope someone clapped back with how much slower his Discourse platform is than one of his competitors. I'll bet it's a lot worse than 15%!
    Reply
  • derekullo
    On my router I've got the DNS server set to AdGuard DNS ... 94.140.14.14 and 94.140.14.15.
    Good first line of defense that applies automatically to all devices that connect to my wifi or wired connection.
    https://adguard-dns.io/en/public-dns.html
    On most all computers I use a custom hosts file to block known malicious and browser-slowing sites.
    https://winhelp2002.mvps.org/hosts.htm
    It's a bit dated ... 2021, but it works!

    I also use Firefox with NoScript and Adblock to block any unwanted ads and trackers that get through the above.
    Reply
  • Neilbob
    jeremyj_83 said:
    I'm sure a lot of the bloat is from ads on the webpage. Doesn't matter how efficiently you make the page if most of the size is from ads.
    That's certainly true. It didn't really occur to me, but if there was an ad back then, it was usually a tiny little GIF image. And not more than a couple of them.

    Ugh, simpler times. Now I want to sit on my rocking chair on the porch and observe the setting sun while sipping a warm beverage.
    Reply