For sale:
16 million textures
3 bytes each

The future of backgrounds on the web.

Algorithmic textures are coming to the web

There are a number of problems associated with the existing solution of tessellating images to generate backdrops. Some of these are discussed in the Keeping an eye on the web document.

Basically there are two main problems:

  • High bandwidth;
  • Not object oriented.
The first of these is the more serious and will be covered first:
  • High bandwidth

    • Existing web backgrounds are made by tessellating bitmapped images. This produces regular, repeating patterns which can only be made less obtrusive by increasing the size of the bitmaps involved. While it is possible to use quite small JPEG images as backgrounds, this solution is far from optimal (as I explain elsewhere).
    • Many web browsers currently try to fetch a page's background before rendering any text. This means that no navigation of the page can occur until this image has been downloaded. If the browser cannot get a response from the server for this file, the rest of the HTML document can sometimes remain undisplayed until the user clicks on STOP. Even if the image is received without problems, it has to be decoded and displayed before anything else can be done. See the quote from Netscape's site for details of this.
    • This current state of affairs in some browsers means that the size of a background image is of critical importance. As mentioned elsewhere I have encountered a number of webmeisters who shy away from background images on their commercial sites for fear of losing their window shopping customers to less graphics-intensive pages.
    • The correct way for browsers to display backdrops is not to load them first. Most modern browsers employ this technique with varying degrees of success, but most of those that do completely clear and redraw the entire web page when the background has loaded. This effect is needlessly intrusive. Exactly when the backdrop should be loaded probably depends on what the user is doing. If they are scrolling around the file frantically, then the background should be left until the last image has been rendered. If they are looking at one part of a page, and all the parts they are looking at have been rendered, then the backdrop should be loaded. A second bitmap copy of the browser window, with the background in place, should be rendered while it is not being displayed. The first window should then be faded into the second using simple linear interpolation between the RGB values of the colours. This will prevent any intrusive flash of the screen. This effect will be implemented because it looks so cool, and all other browsers will have to follow suit or put up with looking clunky and crap.
    • The fact that in some browsers no navigation of the page can occur until the background has been downloaded leads to economies in background image size, and sometimes to no background image being used at all. This not only looks bad but, as described in Keeping an eye on the web, can have a negative impact on users.
  • Not object oriented

    • This is a point that is mainly relevant because of difficulties in resizing bitmaps.
    • Algorithmic textures may easily be generated at any size, and when they are zoomed into, more and more detail becomes evident. They are fractal in a sense: even though they may not be self-similar in form at all scales, they do "go all the way down".
    • At the moment few browsers offer resizing and zooming options. The default size of text on a web page is often only editable through a menu structure or, worse, by editing configuration files. However, it is clear that what some users want is a resize icon in each corner of the browser's window, which can simply be dragged to make the whole page larger or smaller. Limitations in bitmap resizing will become more evident when this feature becomes more widespread.
    • At the moment many browsers do not even use interpolation in resizing images. This could have negative effects when scaling some GIFs, but if you can imagine a 128x128 JPEG scaled to 132x132, then the problem becomes manifest at once. The W3C are aware of this problem and have set out some requirements for the kind of proposals they would be prepared to examine, though they do not yet seem to have any proposals which they are prepared to put their name to.
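The fade technique suggested above (blending a page rendered without its background into one rendered with it) can be sketched in a few lines. This is an illustration only, not the proposal itself; pixels are modelled as (r, g, b) tuples and frames as nested lists:

```python
def lerp_pixel(old, new, t):
    """Linearly interpolate one RGB pixel; t runs from 0.0 (old) to 1.0 (new)."""
    return tuple(round(o + (n - o) * t) for o, n in zip(old, new))

def fade_frame(old_frame, new_frame, t):
    """Produce one intermediate frame of the cross-fade between two renders."""
    return [[lerp_pixel(o, n, t) for o, n in zip(row_o, row_n)]
            for row_o, row_n in zip(old_frame, new_frame)]
```

Rendering a handful of frames at increasing values of t gives the smooth, flash-free transition described.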

A speculative roadmap to the land of algorithmic textures

Algorithmic textures will become incorporated into browsers from two directions.
  • A texture backdrop file format for the web will come into being. Rather than a bitmap, this will contain instructions and guidelines telling the client how to construct one. The client will then be able to generate an image of an appropriate size.
  • For browsers with low bandwidth connections there will be an option very similar to the alt= parameter in <img> tags. This will be a simpler algorithmic texture generator. Using options specified in the <body> tag, it will be capable of generating an image in preference to loading a bitmap backdrop, and in the absence of a texture generation file. For text-only browsers, this could even be translated into a description of the backdrop image, in the absence of any alt= text.
The first option is the hardcore long term solution. The second is useful in illustrating a method by which the first option may be attained.

As mentioned in the title of this page, it is possible to gain access to 16 million textures using only 3 bytes to specify each one. Surely this must be the ultimate in data compression of textures.

In practice this option would be of limited use, but it certainly points the way forward.
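The arithmetic behind the sales pitch: 3 bytes hold 24 bits, and 2^24 is roughly 16 million, so a 3-byte value can select one of 16 million distinct textures if the client uses it to seed a deterministic generator. A minimal sketch of the idea (the greyscale noise generator here is purely illustrative, not a proposed standard):

```python
import random

def texture_from_bytes(spec, width, height):
    """Generate a greyscale texture deterministically from a 3-byte spec.

    The 24-bit spec seeds a pseudo-random generator, so every client
    that receives the same 3 bytes reproduces the same texture, at
    whatever size it chooses to render it.
    """
    assert 0 <= spec < 2 ** 24  # exactly 3 bytes
    rng = random.Random(spec)
    return [[rng.randrange(256) for _ in range(width)]
            for _ in range(height)]
```

The same spec always yields the same image, while the size is entirely up to the client.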

Some may doubt the claims here that browsers will soon support algorithmic textures. However, these people are invited to consider the following:

  • Browsers can support algorithmic textures using simple existing technology, and remain compatible with the rest of the network.
  • Pages which use algorithmic textures will have bitmaps as backups in case other browsers do not yet support algorithmic texture generation (similar to alt= text again).
  • On browsers which do offer support, backgrounds will be available much faster, will be in 24 bit colour, and will not suffer from repetitive pattern problems.
  • Algorithmic textures are perceived as being cool technology; once one browser supports them everyone will want them.
  • Algorithmic textures can allow an easy implementation of letting different areas of the screen be different colours, textures, and shades. Crazy stunts like stretching textures to very long thin horizontal strips so that they provide an attractive left-hand edge to a table would no longer be required.
  • Algorithmic textures can be simply animated, allowing for a more modern equivalent of the animated GIF backdrop, only with a tiny fraction of the download time. This feature alone, although it is very gimmicky, would be sufficient to sell people the idea of algorithmic textures.
All that is needed is a standard.
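The animation claim above is easy to demonstrate: because a procedural texture is a formula, animating it only requires adding a time parameter, and no per-frame bitmaps ever cross the network. A toy plasma-style example (the formula is illustrative, not part of any proposal):

```python
import math

def animated_texture(width, height, t):
    """One greyscale frame of a smoothly animated procedural texture.

    The whole animation is defined by this formula, so only the formula
    need be downloaded -- contrast an animated GIF, where every frame
    is a bitmap. `t` is time in seconds.
    """
    frame = []
    for y in range(height):
        row = []
        for x in range(width):
            # Drifting interference pattern of three sine waves, in [-3, 3].
            v = (math.sin(x * 0.3 + t) + math.sin(y * 0.3 - t * 0.7)
                 + math.sin((x + y) * 0.2 + t * 1.3))
            row.append(int((v + 3) / 6 * 255))  # map [-3, 3] to [0, 255]
        frame.append(row)
    return frame
```

Calling this with successive values of t yields a continuously evolving backdrop from a few dozen bytes of description.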

Such a standard is not currently likely to come from the likes of W3C. As far as the author is able to see, there is no current discussion of the matter taking place. Though the standard would have to be a completely open one, the advantages of being in the position of the developing company should be self-evident. Not only would this contribute to a reputation for innovative and cutting edge web technology, but it would also give a head start over the competition in the area, and offer influence over the future evolution of the standard.

In a similar manner to the way in which script files are currently embedded inside HTML documents, texture generation commands could be embedded into the <body> tag.

Concrete proposals for a vehicle to get there

The HTML attributes to the <body> tag proposed are initially as follows:
  • <body background=back.jpg text=etc...> as normal;
  • <body background=back.alg text=etc...> using Algorithmic texture filetype standard instead of JPEG or GIF;
  • <body ATType=FractNoise ATData=#10F03320 text=etc...> - embedded algorithmic texture.
A list of useful initial attributes could include:
  • ATType - Text description of the type of texture to be generated;
  • ATData="#xxyyzz" - Parameters for ATType - using these two attributes alone, great things could be achieved;
  • fgcolor=#RRGGBB - Foreground colour for simple textures;
  • bgcolor=#RRGGBB - Background colour for simple textures;
  • ATDim - Fractal dimension for fractal noise type textures.
etc.
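To show how little machinery a browser would need, here is a sketch of pulling the proposed attributes out of a <body> tag. The attribute names follow the list above; the parsing approach is illustrative only:

```python
import re

# Matches name=value pairs, with or without quotes, including "#..." values.
BODY_ATTR = re.compile(r'(\w+)\s*=\s*"?(#?\w+)"?')

def parse_body_attrs(tag):
    """Return a dict mapping attribute names (lower-cased) to their values."""
    return {name.lower(): value for name, value in BODY_ATTR.findall(tag)}
```

For example, parse_body_attrs('<body ATType=FractNoise ATData=#10F03320>') yields the texture type and its parameters, which the browser can hand straight to the generator.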

The texture generation program would need to be written in a low level language (for best speed) and Java (for portability) at the very least.

A brief description of the interface of the texture generation plug-in with the browser follows (a more detailed description is here):

The browser asks for a texture to be generated from a referred-to HTML fragment.

It specifies:

  • Colour depth (1-24 bit);
  • Use existing palette option (if not 24 bit);
  • What type of dithering to use (if not 24 bit);
  • Width and Height. These would normally be -1. However, algorithmic textures can easily be made to tessellate, and at the browser's request this should be an option. This may be due to the user wanting to speed up texture generation, or to the program running short of memory.
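The parameter list above can be summarised as a single request record passed from browser to plug-in. All names here are illustrative, not part of any actual API:

```python
from dataclasses import dataclass
from typing import List, Optional, Tuple

@dataclass
class TextureRequest:
    """What the browser hands the texture plug-in, per the list above."""
    html_fragment: str                  # e.g. 'ATType=FractNoise ATData=#10F03320'
    colour_depth: int = 24              # 1-24 bits per pixel
    palette: Optional[List[Tuple[int, int, int]]] = None  # reuse if < 24-bit
    dithering: Optional[str] = None     # dither type to use if < 24-bit
    width: int = -1                     # -1 = fill the window; positive values
    height: int = -1                    # request a tessellating tile instead
```

The plug-in would return a bitmap of the requested (or chosen) size, leaving all rendering policy with the browser.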
For some light relief from these matters, readers are invited to peruse the only other proposal concerning alternative web textures the author has encountered.

The technology to perform all the algorithmic texture generation described in this document is available now.


© Tim Tyler, 1996-1997.