Websites these days are horribly RAM-hungry, especially with all of, you guessed it, JavaScript.
For shits and giggles, a few months back I pulled out an older P166 with 48MB of RAM. I was able to run a relatively modern browser that could render modern websites. Trouble was, it ran out of RAM and kept hitting virtual memory, so it took FOREVER to load any site.
I think that was the biggest issue, even on my G3. I maxed out the RAM on my G3 a while back and noticed a HUGE improvement in performance with TenFourFox. I have learned that if you don't have at the bare minimum 1GB of RAM for the browser to "fill up" when parsing/rendering, you're going to have serious issues.
I don't see a 68K, or any machine with under 1GB of RAM, rendering modern sites without at least some sort of helper CPU to do the heavy crunching, or even just to act as extra memory.
I will probably get flamed for this, but honestly, in my opinion, modern websites being this way is the result of "bad programming" and dependency hell. There was a good video I watched on YouTube the other day about that very topic: how we need faster computers to run slower software at the same speed. I think the link even came from here.
It actually amazes me that computers today running modern software really don't seem any faster than old computers I use with period-correct software. Now granted, old software doesn't have the same features modern software does, plus there are trendy changes (UI), etc. However, the magnitude of the change doesn't warrant the "bloat" of the change. But I digress.