#?.info

Been using this today:

https://cambridgez88.jira.com/wiki/spaces/OZVM/overview

The Z88 was the last computer released by Sinclair (under the Cambridge Computer name, as Amstrad had by then bought the rights to the Sinclair name). It was an A4-paper (that's “Like Letter-size but sane” to 'murricans) sized slab-style laptop computer. By slab-style I mean the screen and keyboard were built into a single rectangular slab; it didn't fold like a modern laptop. It was Z80 based, had solid state memory, and a 640x64 monochrome supertwist LCD which looked gorgeous. There was 32k of battery-backed RAM, but my understanding is functionality was very limited unless you put in a RAM expansion – limited base RAM being something of a Sinclair trademark, the Spectrum excepted. In classic Sinclair style it had a rubber “dead flesh” keyboard, though this time there was a justification: the keyboard was “quiet”, and that probably was a legitimate selling point.

Sir Clive had a dream dating back to the early 1980s that everyone should have a portable computer that was their “main” computer. The idea took shape during the development of the ZX81, and was originally the intended use of the technologies that went into the QL. Some of the weirder specifications of the QL, such as its 512x256 screen being much wider than the viewable area of most TVs, came from Sinclair's original intention to use a custom CRT with a Fresnel lens set up as the main display for the machine. Early on it was found that the battery life of the portable computer designed around the ZX83 chips was measured in minutes, and the idea was discarded. (I believe, from Tebby's account, that the ZX83 chips remained unchanged because they started to have difficulty getting new ULA designs tested.)

So... after selling up to Amstrad, Sinclair tried one last time and made a Z80-based machine. He discarded both Microdrives (which weren't energy efficient, and I suspect belonged to Amstrad at this point) and his cherished flat screen CRT technologies (which were widely criticized), and finally adopted LCDs. And at that point it looks like everything came together. There were still issues – the machine needed energy-efficient static RAM, which did (and does) cost a small fortune, so the Z88 had limited storage in its base form. Flash was not a thing in 1988, and EEPROMs were expensive and limited, but more conventional EPROMs (which used UV light to erase them) were an affordable storage option.

So, with a combined wordprocessor/spreadsheet (PipeDream), BASIC, a calendar/clock, and file management tools, the computer was finally genuinely useful.

I never got a Z88 as I was still a teenager at the time and the cost was still out of my league. When I got my QL it was 80GBP (on clearance at Dixons) which I just had enough savings for. Added a 25GBP monitor a few months later. But that gives you some idea of the budget I was on during the height of the original computer boom.

Anywho, IIRC the Z88 ended up being around 200GBP and the media was even more expensive, which would have been a hell of a gamble for me at the time given that, despite Sir Clive's intentions, it was far from a desktop replacement. It had limited programmability – it came with BBC BASIC (not SuperBASIC, as Amstrad now had the rights to that) but otherwise development was expensive. And a 32K Z80 based computer in 1988 was fairly limited.

But I really would have gotten one had I had the money. I really loved the concept.

The emulator above comes as a Java package that requires an older version of Java to run. It wouldn't start under OpenJDK 17 (which is what comes with Debian 12), but I was able to download JDK 6 from Oracle's Java SE 6 archive (https://www.oracle.com/java/technologies/javase-java-archive-javase6-downloads.html), which ran fine from the directory I installed it into without having to mess with environment variables.

Anyway, a little glimpse into what portable computing looked like in the 1980s, pre-smartphones and clamshell laptops.

See also:

There's also the ill-fated Commodore LCD, a 6502 KERNAL based system designed by Bil Herd. It wasn't a slab, having a fold-out screen, but was similar in concept. It was killed by an idiotic Commodore manager who asked Radio Shack whether Commodore should enter the market with a cheap laptop, and who believed the Radio Shack executive he spoke to when said exec told him there wasn't a market. Radio Shack was, of course, selling the TRS-80 Model 100 at the time, and making money hand over fist.

Final comment: these types of slab computer weren't the only “portable computers” in the 1980s. Excluding luggables (which weren't true portables in the sense that they couldn't run without a mains power connection), and a few early attempts at clamshell laptops, there were also “pocket computers”. Made mostly by Casio and Sharp, these were miracles of miniaturization, usually with only a few kilobytes of memory at most and a one or two line alphanumeric LCD. I had a Casio PB-80, which had about 500 bytes of usable memory. (IIRC they called bytes “steps”, reflecting the fact these things were designed by their manufacturers' programmable calculator divisions.) They did have full versions of BASIC, and arguably their modern successors are graphing calculators. These devices were nice, but their lack of any communications system or any way to load/save to external media made them limited for anything beyond really simple games and stock calculator functions.

So again, a set of random thoughts. But it culminates with wondering whether the official story behind at least one of the major UI changes of the 21st Century isn't... bollocks.

History of the GUI

So let's go back to 1983. Apple releases a computer called the Lisa. It's slow, expensive, and has a few hardware flaws, notably the stretched (non-square) pixels of the screen, which seemed OK while they were designing it but became an obvious problem later on. But to many, it's the first glimpse of the modern GUI. Drop down menus. Click and double click. Icons. Files represented by icons. Windows representing documents. Lisa Office was, by all accounts (I've never used it), the pioneer that set the stage for everything that came afterwards. Apple is often accused of stealing from Xerox, and certainly the base concepts came from Doug Engelbart's pioneering work and Xerox's subsequent development of office workstations, but the Lisa neatly fixed most of the issues, and packaged everything in a friendly box.

The Mac came out a year later, and while it's often described as a low cost version of the Lisa, that's not really fair to the Mac team, who developed their system largely in parallel with the Lisa's, the two teams swapping ideas with one another. The Mac contained hardware changes, such as square (1:1) pixels, that never made it into the Lisa, cut a sizable number of things down so they'd work on lower capacity hardware, and introduced an application-centric user interface compared to the Lisa's more document-centric approach.

Meanwhile Microsoft and Digital Research tried their hands at the same thing, Microsoft influenced primarily by the Lisa, and DR by the Mac, with the latter's GEM system coming out almost exactly a year after the Mac, and Microsoft's Windows, after a lot of negotiations and unusual development choices, coming out nearly a year after that.

The cat was out of the bag, and from 1985 onwards virtually every 16/32 bit computer came with a Mac/Lisa inspired GUI. There are too many to name, and I'd offend fans of {$random 1980s 16/32 bit platform} by omitting their favorite if I tried to list them all, but there were a lot of choices and a lot of great and not so great decisions made. Some were close to the Mac, others were decidedly different, but everyone from NeXT to Commodore adopted the core concepts: windows, icons, pointer, mouse, scrollbars, drop downs, and so on.

By the early 1990s, most mainstream GUIs, Windows and NEXTSTEP excepted, were very similar, and in 1995, Microsoft's Windows 95 brought Windows to within spitting distance of that common user interface. The start menu, the task bar, and the decision to put menus at the top of every window instead of the top of the screen distinguished Windows from the others, but it was close enough that someone who knew how to use an Amiga, ST, or Mac could use a Windows PC and vice versa without effort.

Standardization

But what made these similar UIs useful wasn't cross platform standardization, it was cross application standardization. Even before Windows 95, there was an apex to the learning curve that everyone could reach. If you knew how to use Microsoft Excel, and you knew what a word processor was, you could use Microsoft Word. You could also use WordPerfect. You could also use Lotus 1-2-3, at least the Windows version when it finally came out.

This was because, despite differences in features, you operated each in the same way. The core applications built a UI from similar components. Each document had its own window. The menu was ordered in approximately the same way. The File menu let you load and save, the Edit menu allowed block copying, and if there was a Format menu, that's where you changed roman to italics, etc. Tools? You could reasonably guess what was there.

Underneath the menu (or, in the Mac's case, usually as moveable “palettes”) were toolbars, which were frequently customizable. The blank sheet of paper on one let you create a new document, the picture of the floppy saved it. The toolbar with the bold B button, underlined U button, and drop down with a list of fonts let you quickly adjust formatting. So you didn't have to go into the menus for the most common options.

The fact was all programs worked that way. It's hard to believe in 2024, because most developers have lost sight of why that was even useful. To an average dev in 2024, doing the same thing as another program is copying it. But to a dev in 1997, it meant you could release a complex program to do complex, difficult-to-understand things that people already knew how to use.

Microsoft breaks everything

You may have noticed that's just not true any more. Technically both, say, Chrome and Firefox still have the regular drop down menus, but they've gone to great lengths to hide them, and encourage people to use an application-specific “hamburger menu” instead. And neither has a toolbar. The logic is something like “save screen space and also make it work like the mobile version”, but nobody expects the latter, and “saving screen space” is... well, an argument for another time.

(Side note: I've been arguing for a long time among fellow nerds that the Mac's “top of screen” rather than “top of window” approach to menus is the superior option (I'm an old Amiga user), explaining Fitts's Law to them and how putting a menu at the top of the screen makes it quick to hit the menu options, whereas doing the same when the menu is at the top of a window is fiddly. Usually the response comes back “Oh so you're saying it saves screen space? Pffft who needs to, we all have 20” monitors now”, and I shake my head in annoyance at the fact nobody reads anything any more, not even things they think they're qualified to reply to. Dumbasses. No wonder GUIs have gone to hell. Anywho...)
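For reference, since we're here, the usual Shannon formulation of Fitts's Law is roughly (a textbook sketch, not tied to any particular toolkit):

T = a + b \log_2(D/W + 1)

where T is the time to hit a target, D is the distance to it, W is its size along the direction of travel, and a and b are empirical constants. A menu bar at the top of the screen has effectively unlimited W in the vertical direction, because the pointer stops dead at the screen edge however hard you throw it, which is why it's so much quicker to hit than a menu sitting at the top of a window.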

Anyway, while it's kind of relevant that nerds don't appear to understand why UIs are designed the way they are and aren't interested in finding out, that's not the point I was making. The point is that if “we all have 20” monitors now so have plenty of space” is some kind of excuse for wasting desktop space, then refusing to have a menu in the first place can't be justified on the basis of saving it.

But Google and Mozilla are just following a trend. The trend wasn't set by either (though they're intent on making things worse), and wasn't even set by the iDevices when Apple announced them (though those have given devs excuses to make custom UIs for their apps.) It was set by Microsoft, in 2007, with the introduction of The Ribbon.

The Ribbon is an Office 2007 feature where menus and toolbars were thrown out and replaced by a hard coded, giant, toolbarish thing. Functions are very, very roughly categorized, and then you have to scan the entire ribbon tab you think your function might appear on to find it, because within a tab things are laid out in no particular order.

It is, hands down, the single worst new UX element ever introduced in the post-1980s history of GUIs. Not only do you need to learn each application that uses it from scratch, because your knowledge of how other similar applications work no longer applies, but you can spend your whole life not realizing basic functionality exists because it's hidden behind a component of the ribbon that's not immediately relevant.

And learning how to use Excel, and knowing how a word processor works (maybe you used Office 2003?) brings you no closer to knowing how to use Microsoft Word if you use a ribbon version.

Microsoft was roundly criticized for this UI innovation, and a good thing too, but Microsoft decided, rather than responding to criticism, to dig in their heels and wait for everything to blow over. They published usability studies that claimed it was more productive, but it's unclear how that could possibly be true. The claim was also made that it was based upon earlier usability studies: users, it was claimed, used the toolbar for everything and almost never touched the menus!

Well, no sugar Sherlock. Most toolbars are set up by default to have the most frequently used features on them. And for many of the menu options, users remember the keyboard shortcuts and use those instead. So of course people will rarely dig into the menus. The menus are there to make it easy to find every feature, not just the most frequently used ones, so it stands to reason they'd be used rarely if they're only being used to find infrequently used functionality!

My personal theory though is that this wasn't a marketing department making a bad choice and wanting to stand by it to save face. This was a deliberate decision by Microsoft to push through a UI change that would intentionally make even Office harder to use. After all, where would the harm have been in supporting both user interfaces? Chrome and Firefox do it, and there was nothing in the Ribbon that couldn't have been triggered by a menu.

Anti-Trust and the importance of Office

The work that led to the Ribbon almost certainly started shortly after Microsoft's anti-trust problems concluded, and during a phase where they were under even more anti-trust scrutiny. Until the 2001 Bush administration, Microsoft had been butting heads with the DoJ, culminating in Judge Jackson's findings of fact that Microsoft had illegally used its market position to force out competitors.

While Microsoft's issues with Digital Research/Caldera (DR DOS) and IBM (OS/2) were highlights of the FoF, the issues that had sparked intervention were related to their attempts to dominate web browsers and Microsoft's decision to integrate a web browser into their operating system. Microsoft had made that decision in order to own the web, tying what should have been an open standard into the Windows APIs. By 1999, Internet Explorer had an even more extreme hold on Internet usage than Chrome does today, with many websites banning other browsers, and many others simply broken in anything that wasn't IE. These weren't obscure websites nobody needed to use either; I personally recall being blocked from using various banking and governmental websites at the time.

In 2000, the courts ruled in favor of a break-up of Microsoft into an applications company and an operating system company. In 2001, this was overturned, but a sizable reason for the appeal court doing so was related to the judge's conduct rather than the merits of the case. Bush's DoJ stopped pushing for a break-up in late 2001, largely in line with Bush's opposition to anti-trust actions, and Microsoft was given more or less free rein, outside of an agreement to be more open with their APIs.

From Microsoft's point of view, “winning” the anti-trust case must have been bittersweet because of the way it was won. The findings of fact were upheld throughout the legal proceedings, and Microsoft only avoided break-up because they successfully wound up the judge enough for him to behave unprofessionally, and because they waited out the clock and were able to get the end of the legal case overseen by a more sympathetic government. There were no guarantees the same thing would happen next time.

It's not clear exactly when Microsoft started to judge Office as being more important than Windows to their future, but certainly in the 2000s we saw the beginnings of a change in attitude that made it clear Microsoft was trying to find a way forward that was less reliant on Windows. Office was the most obvious second choice – most corporate PCs run Office, as do a sizable number of non-corporate PCs. Even Macs run Office. Office had a good reputation, and it was (and is) extremely powerful. And because of its dominance of the wordprocessing and spreadsheet markets, the files it generated were themselves a form of lock-in. If you wanted to interact with another Word user, you needed to use Word. There were third party wordprocessors that tried to support Word's file format, but it turned out supporting the format was only half the problem: if your word processor didn't have the exact same document model that Word did, then it would never be able to successfully import a Word document, or export one that would look the same in Word as it did in your third party wordprocessor.

But Office's dominance through file incompatibility wasn't guaranteed to last. In 2000, Microsoft had switched to a more open file format, and in 2006, under pressure from the EU, published the complete specification. Critics at the time complained it was too complicated (the full specification runs to some 6,000 pages), but bear in mind this covers the formats for all the applications under the Office umbrella.

Two decades later, compatibility from third party applications remains elusive, most likely because of those internal document model conflicts. But it wasn't clear in the early 2000s that even publishing the Office file formats wouldn't be enough to allow rivals to interoperate within the Office ecosystem.

The importance of UI balkanization

So, faced with the belief that third parties were about to create Office clones that would take a significant chunk out of Microsoft's revenue, and knowing that they could no longer use the operating system to simply force people to use whatever applications Microsoft wanted them to buy, Microsoft took an extreme and different approach – destroying the one other aspect of interoperability required for users to move from one application to another: familiarity.

As I said above, in the late 1990s, if you knew Word, you knew how to use any wordprocessor. If you knew Excel, and you knew about wordprocessing, you could use Word. The standardization of menus and toolbars had seen to that.

To kill the ability of customers to move from a Microsoft wordprocessor to a non-Microsoft wordprocessor, Microsoft needed to undermine that standardization. In particular, it needed a user interface where there was no standard, intuitive way to find advanced functionality. While introducing such a kludgy, unpleasant user interface was unpopular, Microsoft had the power to impose such a thing in the early 2000s, as its office suite was a near monopoly. Customers would buy Office with the Ribbon anyway, because they didn't have any choice. And with the right marketing, they could even make it sound as if the changes were a positive.

Hence the Ribbon. Until you actually try to use it, it doesn't look unfriendly, making it very easy to market. And for perhaps the most popular wordprocessing features, it's no worse than a toolbar. But learning it doesn't help you learn the user interface of any other application. Anyone introduced to wordprocessing through the Ribbon version of Word will have no idea how to use LibreOffice, even if LibreOffice has a ribbon. The user interface will have to be relearned.

Note that Microsoft didn't merely introduce the Ribbon as an optional user interface. Firefox and Chrome, to this day, still have the ability to bring up a traditional menu in addition to their hamburger menu, because they know end users benefit from it. It's just, inexplicably, hidden (use the ALT key!). But in Word there is no menu, and nothing to make it easier for end users to transition to the Ribbon or keep doing things the way they always did, despite the ease with which Microsoft could have implemented that.

We forgot everything

Microsoft's success foisting the Ribbon on the world basically messed up user interfaces from that point onwards. With the sacred cow of interoperable user interfaces slaughtered, devs started to deprecate standardization and introduce “new” ways to do things that ignored why the old ways had been developed in the first place. Menus have been replaced with buttons, scrollbars have been replaced by... what the hell are those things... and there's little or no logic behind any of the changes beyond “it's new so it doesn't look old”. Many of the changes have been implemented to be “simpler”, but in most cases the aesthetic is all that's been simplified; finding the functionality a user wants is harder than ever before.

It would have helped if devs had realized at the time that Microsoft had done this for all the wrong reasons. It's not as if most of them trust Microsoft or believe they do things for the right reasons.

I started watching a lot of videos on retrocomputing recently. Well, the era they call retro I call “when I learned what I know now”. The 1980s was a fun time, as far as computers were concerned. There was variety, and computer companies were trying new things.

The most jarring thing I watched, though, was a review of the Timex Sinclair 2068, essentially the US version of the Sinclair Spectrum, which – as you'd imagine from the subject – was a very American view of why that computer failed. And the person reviewing the 2068 felt it failed because it represented poor value compared to... the Commodore VIC 20?

Which, now I've spent some time thinking about it, I think I understand the logic. But it wasn't easy. You see, when I was growing up the schoolyard arguments were not about the ZX Spectrum vs the VIC 20, but its vastly superior sibling, the Commodore 64. And both sides had a point, or so it seemed at the time.

The principal features of the ZX Spectrum were:

  • A nice BASIC. That was considered kind of important then, even in a world where actually the primary purpose of the computer was gaming. Everyone understood that in order for people to get to the point they were writing games in the first place, the computer had to be nice to program.
  • 48k of RAM, of which 41-42k was available to programmers.
  • A single, fixed graphics mode of 256x192, with each 8x8 pixel block allowed to use two colours picked from a palette of 15 (8 colours, each with a normal and a bright variant, bright black being indistinguishable from black).
  • An awful keyboard. There was a revision called the Spectrum+ that had a slightly better keyboard based on the Sinclair QL's (though not really like the QL's, which had a lighter feel to it).
  • A beeper-type sound device, driven directly by the CPU.
  • Loading and saving from tape.
  • A single crude expansion slot that was basically the Z80's pins on an edge connector.

The Commodore VIC 20 had 5k of RAM, 3.5k of it available. It had a single raw text mode of 22x23 characters, with each character position allowed to have two colours. It did allow characters to be user defined. BASIC was awful. Expansion was sort of better: it had a serial implementation of IEEE488 that was broken, a cartridge port, and a user port. Like the Spectrum it was designed to load and save programs primarily from tape. Despite the extra ports, it just wasn't possible to do 90% of the things a Spectrum could do, so I'm baffled the reviewer saw fit to compare the two. About the only point of comparison was price, and even then the VIC 20 was way cheaper than the Spectrum in the UK.

The Commodore 64, on the other hand, was, on paper, superior:

  • OK, BASIC wasn't. It was the same version as the VIC 20.
  • 64k of RAM. Now we're getting somewhere.
  • A mix of graphics and text modes, including a “better than ZX Spectrum” mode which used a similar attribute system for 8x8 blocks of pixels, but had a resolution of 320x200 and which supported sprites. And programmers could also drop the resolution to 160x200 and have four colours per 8x8 cell.
  • A great keyboard
  • A dedicated sound processor, the famous SID
  • Loading and saving from tape.
  • That weird serial implementation of IEEE488 that the VIC 20 had, with the bug removed... but with a twist.
  • Cartridge, and a port for hooking up a modem. And a monitor port. And, well, ports.

So if the C64 was so much technically better, why the schoolyard arguments? Other than kids “not knowing” because they didn't understand the technical issues, or wanting to justify their parents getting the slightly cheaper machine? Well, it was because the details mattered.

  • Both systems had BASIC, but Commodore 64 BASIC was terrible.
  • The extra 16k of RAM was nice to have, but in the end both machines were in the same ballpark. (Oddly, the machine considered in the UK to be superior to both, the BBC Micro, only had 32k.)
  • Programmers loved the 160x200 4 colour mode. It meant there was less “colour clash”, an artifact resulting from limiting the palette per character cell. But oddly, the kids were split on that: most preferred higher resolution graphics over fewer colour clash issues. So even though the Commodore 64 was superior technologically, it was encouraging programmers to do things that were unpopular. One factor there was that most kids were hooking the computer up to their parents' spare TV, which was usually monochrome.
  • The keyboard really didn't matter, to kids. Especially given the computer was being used to play games, and Sinclair's quirky keyword input system and proto-IDE was arguably slightly more productive for BASIC programming than a “normal” keyboard in a world full of new typists.
  • Both computers loaded and saved from tape, but the Spectrum used commodity cassette recorders and loaded and saved programs at a higher speed, around 1500bps vs 300bps.
  • The IEEE488 over serial thing was... just not under consideration. For home computers in the UK, floppy drives were an expensive luxury that didn't take off until the 16 bit era. But, worse, the Spectrum actually ended up being the better choice if random access storage was important to you. Sinclair released a system called the ZX Microdrive, similar to the American stringy-floppy concept (except smaller! Comparable to 2-3 full size SD cards stacked on top of one another), where the drives and interface for the Spectrum came to less than 100GBP (and additional drives were somewhere in the region of 50GBP). The Commodore floppy drives, on the other hand, cost 300-500GBP each. Worse, they were slower than they'd been on the VIC 20 (about as slow as the cassette drive, no less!), despite the hardware bug being fixed, because the computer couldn't keep up with the incoming data.
  • Cartridge ports should also have been a point in Commodore's favour, but for some reason cartridges were very expensive compared to software on tape. (I didn't learn until the 2000s that cartridges were actually cheaper to make.)
  • The other ports were for things kids just weren't interested in. Modems? In Britain's metered phone call system they just weren't going to be used by anyone under the age of 25. Monitors? TVs are cheaper and you can watch TV shows on them!

Over time many of these issues were resolved. Fast loaders improved Commodore 64 software loading times, though the Spectrum had them too. But in the meantime the kids didn't see the two platforms as “cheap Spectrum vs technically amazing C64”; they were seen as equals, and to be honest, I don't think it was completely unfair, in that context, that they were seen that way. There's no doubt the C64, with its sound and sprites, was the superior machine, but the slow cassette interface and the expensive, broken peripheral system undermined it. As did programmers using features the kids didn't like.

Go across the pond and, sure, nobody would compare the TS2068 with the C64. Americans weren't using tape drives with their C64s. But I'm still not sure why they'd compare the TS2068 to the VIC 20 either.

The Spectrum benefited from its fairly lightweight, limited spec. Not only did it undercut the more advanced C64 on price, it also meant it didn't launch with as many unsolvable hardware bugs. The result was that Sinclair and third parties could sell the add-ons needed to make the Spectrum equal or better its otherwise technically superior rivals, and the entire package still ended up costing less. Meanwhile, the feature set at launch was closer to what the market – kids who just wanted a cheap computer to hook up to their parents' spare TV set to play games – wanted.

All of which said, the TS2068 probably didn't fail because Americans were comparing it to the VIC 20, so much as because it was released late, after the home computer market had already been decided. Word of mouth mattered, and nobody would have been going into a computer store in 1984 undecided about what computer to buy. Timex Sinclair had already improved the TS2068 over the Spectrum by adding a dedicated sound chip; they could have added sprites, maybe even integrated the Microdrives into the system, fixed the keyboard, and not added much to the cost (the Microdrives were technologically simpler than cassette recorders, so I suspect they would have cost under $10 each to add), and the system would still have bombed. It was too late: the C64 and the Apple II/IBM PC dominated the popular and high ends of the US market respectively, and there wasn't any space for another home computer.

Finally set up WriteFreely to do my long-form blogging, which, hopefully, will mean I can write longer stuff of the type most people will skip over. Once I figured out why it didn't work the first time, it seems to work fine. My instance is one I want to share with friends, so there are multiple complications: it's behind a reverse proxy, and I'm using Keycloak to supply SSO.

The only issue I have with what I've configured is that registration is still a “process”: you don't automatically get dropped into the system the first time you log in with OpenID Connect.

For those interested, my Keycloak OpenID Connect configuration required the following:

[app]
...
single_user           = false
open_registration     = true
disable_password_auth = true

[oauth.generic]
client_id          = (client id from Keycloak)
client_secret      = (Client secret from Keycloak)
host               = https://(keycloak prefix)/realms/(realm)
display_name       = Virctuary Login
callback_proxy     = 
callback_proxy_api = 
token_endpoint     = /protocol/openid-connect/token
inspect_endpoint   = /protocol/openid-connect/userinfo
auth_endpoint      = /protocol/openid-connect/auth
scope              = profile email
allow_disconnect   = false
map_user_id        = preferred_username
map_username       = preferred_username
map_display_name   = name
map_email          = email

In the above, (client id) and (client secret) come from the client I set up for WriteFreely in Keycloak's client configuration. For the Keycloak prefix: if you haven't reverse proxied the /auth part of Keycloak URIs away, then you'll need that part to look something like domain/auth, otherwise just domain, eg:

host = https://login.example.social/auth/realms/example/
host = https://login.example.social/realms/example/
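
Putting the host and endpoint settings together, the URLs WriteFreely should end up calling (taking the second, /auth-less form of the host) are the standard Keycloak OpenID Connect endpoints:

https://login.example.social/realms/example/protocol/openid-connect/auth
https://login.example.social/realms/example/protocol/openid-connect/token
https://login.example.social/realms/example/protocol/openid-connect/userinfo

At least, that's how I read the config; if something doesn't line up, the realm's discovery document at https://login.example.social/realms/example/.well-known/openid-configuration is the authoritative list.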

In terms of use, I'm still getting used to WriteFreely. The formatting is a mixture of raw HTML (the fixed font blocks above are in HTML <PRE> tags) and Markdown. In theory Markdown supports fixed font blocks too, but I can't get it to work. The fact you can always resort to raw HTML is good though, and only an issue if you actually need to use < anywhere...

One other thing: for some reason WriteFreely's installation instructions include this block in their example reverse proxy configuration:

location ~ ^/(css|img|js|fonts)/ {
    root /var/www/example.com/static;
    # Optionally cache these files in the browser:
    # expires 12M;
}

This breaks everything: unless nginx can actually find WriteFreely's static files under that root (which it can't if, say, your reverse proxy lives on a different machine), every CSS, JS and font request fails instead of being passed through the proxy. Either remove it, or fix the root and add whatever caching you want for those paths. Another default configuration snafu is that the built-in configurator has WriteFreely listening on localhost if you tell it you're using a reverse proxy, but there's absolutely no reason for it to assume the reverse proxy is on the same computer. So when you edit your config afterwards, change “bind” from localhost to [::] if you're using an external reverse proxy.
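
For reference, after that change the relevant part of config.ini looks something like this – a sketch, with the port being whatever the configurator picked for you (8080 is the default, I believe):

[server]
port = 8080
bind = [::]

If you'd rather not listen on every interface, binding to the machine's LAN address instead of [::] also works; the point is just that localhost is useless when the proxy is on another box.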