Tuesday, December 2, 2025

Reverse Engineering the SGI Indy Monitor Detection, or "thank god someone added SGI indy / indigo 2 support to MAME"

I have a bit of a soft spot in my heart for the SGI Indy and (purple, not teal, heh) Indigo 2.

So imagine my surprise when NetBSD "almost" booted just fine on the Indy I acquired: R4600PC-100, XL8 graphics .. and wonky console colours in NetBSD, plus a wonky Xorg.

The first question was "why are the monitor colours so unpredictable?", and that got me into a fun deep dive into how the SGI Indy Newport graphics works, the whole SGI Indy Linux project circa 2000, and both hardware and software shortcuts.

Anyway.

The TL;DR is here - https://erikarn.github.io/sgi/indy/monitor_detection . I've listed the monitor resolutions/refresh rates gathered from the internet and from my reverse engineering of what MAME was programming.

So the long version.

First up - I've put all the hardware documentation I've found so far at https://erikarn.github.io/sgi/indy/notes

The Indy was booting NetBSD with either correct colours - green kernel text, white userland console text - or incorrect colours - green kernel text, but blue console text. It was random, and it was per boot. X11 was no better - sometimes it had the correct colours, sometimes everything was wonky.

The NetBSD console code tries to set up the following things for 8 bit graphics mode (which is used for the console, even on 24 bit cards):

  • Program a 256-entry colourmap table matching the NetBSD RGB 332 colour scheme;
  • Add a 1:1 RGB ramp in another colour table (RGB2);
  • Write a bunch of "XMAP9 mode" lines mapping 32 entries of "something" to RGB8 pixel format, RGB2 colourmap.
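The RGB 332 colourmap in the first bullet is simple enough to sketch. This is a minimal, self-contained illustration of expanding an rrrgggbb index into 8-bits-per-channel entries; the function names are mine, not NetBSD's, and the bit-replication trick is just one common way to get a full-range ramp:

```c
#include <stdint.h>

/* Expand one RGB 3:3:2 colour index (rrrgggbb) into 8-bit channels.
 * Hypothetical helper, not the actual NetBSD driver code. */
static void rgb332_to_rgb888(uint8_t idx, uint8_t *r, uint8_t *g, uint8_t *b)
{
    uint8_t r3 = (idx >> 5) & 0x7;
    uint8_t g3 = (idx >> 2) & 0x7;
    uint8_t b2 = idx & 0x3;

    /* Replicate the high bits into the low bits so 0x7 maps to 0xff,
     * giving a full-range ramp rather than a truncated one. */
    *r = (uint8_t)((r3 << 5) | (r3 << 2) | (r3 >> 1));
    *g = (uint8_t)((g3 << 5) | (g3 << 2) | (g3 >> 1));
    *b = (uint8_t)((b2 << 6) | (b2 << 4) | (b2 << 2) | b2);
}

/* Fill a 256-entry table, as the console setup would program into the CMAP. */
static void build_cmap_332(uint8_t cmap[256][3])
{
    for (int i = 0; i < 256; i++)
        rgb332_to_rgb888((uint8_t)i, &cmap[i][0], &cmap[i][1], &cmap[i][2]);
}
```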

I was very confused about what was going on versus what should be going on, and I don't want to dig into the whole journey I took to get here. But the TL;DR is that everything in the NetBSD console setup path is wrong: when it "worked", it ended up with the wrong colours, and when it "didn't work", it also sometimes ended up with the wrong colours.

I'll write a separate post later about how the whole newport graphics system holds together, but fixing this requires a whole lot of driver changes to correctly program the hardware, and then some funky monitor timing specific programming.

The 13W3 port on the Newport graphics boards has a 4 bit monitor ID which compatible monitors will output. There are more details available at https://old.pinouts.ru/Video/sgivideo_pinout.shtml

The "universal 13W3 interface input cable" that I bought has a bunch of DIP switches controlling this.

Whether you have all four monitor ID bits on or all off, you still get 1024x768 @ 60Hz.

The "fun" part of this story is that if I'd been using 1280x1024 straight off the bat, I'd likely not have seen these problems happen so often.

Anyway.

Depending upon the settings, the Indy will boot with a bunch of different possible monitor setups:

  • 1024x768, 60Hz
  • 1024x768, 70Hz
  • 1280x1024, 60Hz
  • 1280x1024, 72Hz
  • 1280x1024, 76Hz

I enumerated this list and threw them up on the monitor detection link at the beginning of the article.

So, the firmware reads these four bits at boot (via 4 IO bits on one of the CMAP chips - again, see the links at the top of the post), sets up the monitor timing, and then displays stuff. But NetBSD's console programming was getting the colours wrong when I was using 1024x768 60Hz.

It turns out that the XMAP chips - which handle the final mapping of incoming framebuffer pixel data to the 24 bit RGB that's sent to the CMAP chip and then the RAMDAC - were being programmed inconsistently. (Again, they were also being programmed incorrectly in NetBSD, but I've got a big diff to fix that. With that applied, they're programmed correctly, but inconsistently.)

There's a "display control bus" (DCB) that the Newport raster chip (REX3) uses to talk to its peripheral chips. The peripherals - the XMAPs, the VC for timing, the RAMDAC, the CMAP for 8/24 bit colour table mapping - are all DCB devices. The DCB has some address lines, 8 data bits, a programmable chip select line, chip select setup/hold/release timing, optional request/ACK signaling, and register auto-increment functionality.

However!

  • The REX3 chip runs at 33MHz;
  • The XMAP chips run at 1/2 the pixel clock (they're interleaved);
  • The DCB supports explicit ACK signaling from the peripheral, but only if the peripheral implements it;
  • The XMAP does not have an ACK line, just an incoming chip select line; and
  • When writing the XMAP mode table lines - which map the display information to pixel format / colour table selection - it's done as back-to-back bursts to the same register, not an auto-increment, and NOT using an ACK line.

This means that if the XMAP chip is running at a speed that doesn't line up with the programmed chip select timing, the mode writes will be junk. The normal 8 bit reads/writes are "mostly fine", as they just show up as multiple 8 bit reads/writes to the same register, and for all the OTHER registers that's just fine. But for the mode register - where the DCB needs to write 4 bytes to the same individual address - it's absolutely not fine.
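To make that failure mode concrete, here's a toy model (mine, not real hardware behaviour) of the 4-byte burst into the XMAP mode register: each byte the chip actually latches gets shifted into a 32 bit word, so a single missed chip select strobe corrupts the whole assembled mode word:

```c
#include <stdint.h>
#include <stddef.h>

/* Toy model: the XMAP assembles its 32-bit mode word from byte writes,
 * one byte per chip-select strobe, MSB first.  'latched' flags mark
 * which strobes the chip actually saw given its clock vs. the DCB timing. */
static uint32_t xmap_mode_burst(const uint8_t bytes[4], const int latched[4])
{
    uint32_t word = 0;

    for (size_t i = 0; i < 4; i++) {
        if (latched[i])
            word = (word << 8) | bytes[i];
        /* a missed strobe simply never shifts its byte in */
    }
    return word;
}
```

With all four strobes latched you get the intended word; drop one and the remaining bytes land in the wrong positions, which is exactly the kind of junk a too-fast chip select timing produces.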

And that's the kicker.

After spending some quality time with the MAME emulator and some local hacks to enable the newport peripheral IO logging and setting the monitor ID, I found out that the timing used for the XMAP chips is different for 1024x768 60Hz versus 1280x1024 76Hz.

Everything worked just fine when I adjusted it.

So ok, I went back to the Linux and X11 drivers to see what's going on there, as I knew the C code wasn't doing this. And I found this gem in the Linux newport.h header file:

static __inline__ void
xmap9SetModeReg (struct newport_regs *rex, unsigned int modereg, unsigned int data24, int cfreq)
{
        if (cfreq > 119)
            rex->set.dcbmode = DCB_XMAP_ALL | XM9_CRS_MODE_REG_DATA |
                        DCB_DATAWIDTH_4 | W_DCB_XMAP9_PROTOCOL;
        else if (cfreq > 59)
            rex->set.dcbmode = DCB_XMAP_ALL | XM9_CRS_MODE_REG_DATA |
                    DCB_DATAWIDTH_4 | WSLOW_DCB_XMAP9_PROTOCOL;
        else
            rex->set.dcbmode = DCB_XMAP_ALL | XM9_CRS_MODE_REG_DATA |
                        DCB_DATAWIDTH_4 | WAYSLOW_DCB_XMAP9_PROTOCOL;
        rex->set.dcbdata0.byword = ((modereg) << 24) | (data24 & 0xffffff);
}

It's choosing different DCB timing based on the pixel clock. It lines up with what I've been seeing from MAME, and it adds a third one - WAYSLOW - which I bet I'm only going to see with PAL/NTSC timings, or if something really wants to do something like 1024x768 50Hz.
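As a sanity check, the two modes I was testing fall into different tiers of that cfreq switch. A minimal sketch of just the tier selection, with the enum names mine (they merely mirror the Linux header's protocol constants) and the pixel clocks the nominal ones for these modes (~65MHz for 1024x768 @ 60Hz, ~135MHz for 1280x1024 @ 76Hz):

```c
/* Mirror the cfreq thresholds from the Linux xmap9SetModeReg().
 * Returns which DCB protocol tier a given pixel clock (in MHz) selects.
 * Enum names are illustrative, not from the real header. */
enum xmap9_tier { XMAP9_W, XMAP9_WSLOW, XMAP9_WAYSLOW };

static enum xmap9_tier xmap9_pick_tier(int cfreq_mhz)
{
    if (cfreq_mhz > 119)
        return XMAP9_W;        /* W_DCB_XMAP9_PROTOCOL */
    else if (cfreq_mhz > 59)
        return XMAP9_WSLOW;    /* WSLOW_DCB_XMAP9_PROTOCOL */
    else
        return XMAP9_WAYSLOW;  /* WAYSLOW_DCB_XMAP9_PROTOCOL */
}
```

Assuming those nominal clocks, 1024x768 @ 60Hz lands in the WSLOW tier while 1280x1024 @ 76Hz lands in W, matching the differing XMAP timings MAME programs for the two modes.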

The timings are in the header file, but .. nothing is using xmap9SetModeReg(). It was likely copied from some internal SGI code (the PROM? X server? Who knows!) as part of the bring-up, but it was never wired in.

Anyway! With this added to the NetBSD console code, the console finally works reliably in all the modes I've tested. I'm going to try to get my big diff stack landed in NetBSD, and then I'll work on the X11 newport code so it too supports 8 and 24 bit graphics at 1024x768 reliably.

To summarise, the required setup is:

  • Read the CMAP1 register (and the PROM on the SGI Indy) to determine the monitor type
  • The default monitor on the SGI Indy is 1024x768 60Hz, and on the Indigo2 it's 1280x1024 60Hz
  • Select an XMAP9 mode DCB timing set based on the pixel clock
  • 8 bit mode for the console and X11 needs the colour index table programmed into the CMAP at CI offset 0, and appropriate XMAP config for the display mode table to use 8 bit pixels, PIXMODE_CI, offset 0, NOT 8 bit RGB
  • 24 bit mode for X11 needs the 24 bit RGB ramp programmed into the CMAP RGB2 table (which is not a colour index table!), and no CMAP lookup
  • Importantly, the X11 server uses truecolour for 24 bit mode and pseudocolour / colourmaps for 8 bit mode, so all of this needs repeating in the X11 server code!
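The PIXMODE_CI-versus-RGB8 distinction above is worth illustrating, since it's exactly the kind of mix-up that turns intended colours into wrong ones. A toy sketch (the functions and the 332 expansion are mine, not the hardware's) of the two paths the same pixel byte can take:

```c
#include <stdint.h>

/* CI path: the pixel byte indexes a colourmap table (as the console wants). */
static uint32_t pixel_via_ci(uint8_t px, const uint32_t cmap[256])
{
    return cmap[px];
}

/* RGB8 path: the pixel byte is interpreted directly as rrrgggbb. */
static uint32_t pixel_via_rgb332(uint8_t px)
{
    uint32_t r = (uint32_t)((px >> 5) & 0x7) * 255 / 7;
    uint32_t g = (uint32_t)((px >> 2) & 0x7) * 255 / 7;
    uint32_t b = (uint32_t)(px & 0x3) * 255 / 3;

    return (r << 16) | (g << 8) | b;  /* 0x00RRGGBB */
}
```

If the colourmap maps index 7 to green but the XMAP is (mis)configured for direct RGB8, that same byte decodes to a mostly-blue pixel instead - much like green kernel text turning into blue console text.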

Here's how the console looks, complete with the correct XMAP9 mode table:

And here's how X11 looks:


(And the X11 support is even more fun, because I had to fix some stuff to make acceleration in the driver work, and that's going to be a whole fun DIFFERENT post.)

Addendum:

Oh, and the sync on green? It's generated by the RAMDAC. Once this all has landed in NetBSD I'm tempted to try to add a sysctl / boot parameter to disable the sync on green bit so normal monitors work on the SGI Indy. Let me know if you'd like that!