private: No

title: Why is the web the way it is?
date: 2023-07-31


I know that the author presents the VNC idea facetiously, but I've
always wondered how we got to where we're at technologically.

Why did the hypertext and "thick client" approach to a general-purpose,
network-accessible, platform-independent, graphical software platform
come to dominate?

Why aren't interactive graphical applications more commonly served over
the network using other methods, such as (1) remote framebuffering or
(2) streaming and executing a virtual machine image on the client-side?

The primary roadblock to approaches involving "thin clients" and
server-side rendering has always been network latency. You're limited by
the speed of light, so to ensure usable performance, you would need to
make sure a server is always close to your users. The user experience on
secure shell, let alone remote graphical applications, can totally suck
on networks with high latency.
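To make the speed-of-light limit concrete, here's a back-of-the-envelope sketch. The distance and the "light in fiber travels at roughly two-thirds of c" figure are rough assumptions, and real routes add routing and queuing delay on top:

```python
# Lower bound on thin-client round-trip latency, from physics alone.
SPEED_OF_LIGHT_KM_S = 300_000
FIBER_FRACTION = 2 / 3  # light propagates slower in glass than in vacuum

def min_rtt_ms(distance_km: float) -> float:
    """Best-case round-trip time over fiber, ignoring routing and queuing."""
    one_way_s = distance_km / (SPEED_OF_LIGHT_KM_S * FIBER_FRACTION)
    return 2 * one_way_s * 1000

# New York to Sydney is roughly 16,000 km as the crow flies.
print(round(min_rtt_ms(16_000)))  # ~160 ms before the server does any work
```

At ~160 ms minimum per interaction, every keystroke or click in a server-rendered UI feels sluggish, which is why a thin client only works well when a server is physically near the user.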

The other issue with remote framebuffering is web scraping. How would a
search engine crawler scrape a site streamed over a remote framebuffer
or run locally as a virtual machine image? You'd probably need to
invent some data format that the RFB or VM would additionally output.
That adds complexity (and still doesn't solve the latency problem).

I have a distaste for how complicated it has become to implement the web
standards required to create a browser. However, HTML and friends do
have some features that would appear difficult to implement with
non-hypertext approaches:

* General-purpose, network-accessible, graphical software platform.
* General-purpose hyperlinks, not to mention URI fragments.
* Usable for both documents and applications.
* More latency-tolerant than "ultra-thin" server-side rendering.
* Includes scripting language for client-side processing where the
  network latency is too high (or servers want to avoid the load).
* Relatively crawler-friendly format for structured and unstructured
  data and metadata.
* The HTTP concept of "resources" has helped HTTP become the network
  protocol of choice for even native applications.
* Low barrier to entry with primitives that make it easy for authors
  using basic software. (Slap `h1` and `p` tags in a text file, and
  you've got document structure.)
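The crawler-friendliness point is easy to demonstrate: a few lines of stdlib Python can pull structured text out of hypertext, something a framebuffer stream or VM image can't offer without extra machinery. This is just a minimal sketch, not how production crawlers work:

```python
# Extract heading text from HTML using only the standard library.
from html.parser import HTMLParser

class HeadingExtractor(HTMLParser):
    """Collects the text content of h1-h3 elements."""

    def __init__(self):
        super().__init__()
        self.in_heading = False
        self.headings = []

    def handle_starttag(self, tag, attrs):
        if tag in ("h1", "h2", "h3"):
            self.in_heading = True

    def handle_endtag(self, tag):
        if tag in ("h1", "h2", "h3"):
            self.in_heading = False

    def handle_data(self, data):
        if self.in_heading:
            self.headings.append(data.strip())

parser = HeadingExtractor()
parser.feed("<h1>Why is the web the way it is?</h1><p>Some prose.</p>")
print(parser.headings)  # ['Why is the web the way it is?']
```

Because the markup carries semantics (`h1` means "top-level heading"), a crawler gets document structure for free, with no need for the side-channel data format an RFB or VM approach would require.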

The web platform is very heavy, but you get a lot of power (i.e.,
features) in return. For browser implementers, the cost is high, but the
platform makes a lot of sense for business and commercial applications.

This raises the question of whether one could create a system that
replicates (or exceeds) the features of the web at significantly
lower implementation complexity. (For example, do you really need three
web languages (HTML, CSS, JavaScript), or could you get away with a
unified language that describes structured data, presentation, and
client-side scripting?)