

mossy_11's picture
Offline
Joined: 2009 Sep 5
The rise and fall of Ambrosia Software

Of interest to many folks here — back in October I gave an hour-long talk at PAX Aus about the history of Ambrosia Software, from its earliest beginnings through to last year's official closure.

The audio, slides, and script for that talk are now all available via my games history podcast website at https://lifeandtimes.games/episodes/files/pax-aus-19-ambrosia-sw-talk

I'll be continuing to research their history in the new year, with more interviews with former devs and office staff and more digging into other stuff related to their games, for eventual inclusion in the second volume of The Secret History of Mac Gaming.

It's also worth noting that I have two past episodes from the show that are in the realms of Mac gaming history — one, recently, that told the amazing story of how Silicon Beach Software got digital sound and voice samples in Airborne and Dark Castle, and the other, ages ago, on Ambrosia's "eek a bug!" marketing stunt (where PR guy Jason Whong declared he would — and indeed then did — eat actual bugs at Macworld Expo if any Ambrosia games shipped with bugs that year). And in February I made a sort of audio collage of interview clips about the impact of the original Macintosh on games.

Comments

m68k's picture
Offline
Joined: 2016 Dec 30

Very nice blog with lots of insights!

For an outfit so hell-bent on shareware and digital-only delivery, I find Ambrosia's registration system (at least for the Classic Macs) woefully inadequate.
I like the look of their programs, but the registration hurdle stands at the very beginning of the software experience.

Jatoba's picture
Offline
Joined: 2018 Apr 16

Nice stuff! I can't believe that PR guy pulled off such a marketing stunt... epic. Laughing out loud

Registration in general has always been a pain for almost any program, but it is somewhat understandable. I think anything that isn't DRM or similar is okay, as far as registration goes. (Or anything that doesn't require "relicensing" of your purchased apps every month and/or use of some external validation server.)

Nonetheless, I miss Ambrosia's OS 9 & earlier era. They were at their best then.

@Mossy, will you also try to research other companies and developers? If so, I'd suggest trying to contact what was left of Fantasoft, whose most famous (& in my opinion greatest) game is Realmz. It was originally created by Tim Phillips, but for years, it seems NO ONE has been able to get ahold of THAT guy, not even for comment. Even former Fantasoft staff can't reach him, such as Skip Meyer and Spoonlard (AKA David Riley). All we know is that he's alive and minding his own business. Kinda like kRat (AKA Kyle Ellrott) and the Mac demoscene (assuming this is him, plus this, this, this and this).

mossy_11's picture
Offline
Joined: 2009 Sep 5

Yep, I am and have been researching all across the Mac gaming scene of the 80s/90s/early 2000s. I got lots of info on Silicon Beach, Freeverse, (early) Cyan, Bungie, ICOM Simulations, Cliff Johnson, and John Calhoun, among others, when I was writing my book.

I tried to find Tim Phillips a couple years ago and couldn't find any way to get in touch, but I will definitely try again next year and I'll aim to interview anyone I can who worked with him at Fantasoft as well. Likewise with Parsoft founder Eric Parker (Hellcats Over the Pacific, A-10 Attack) — nobody could give me the slightest indication where he might have gone when I was doing my book interviews. I have more tools and techniques for finding people now than I did then, so maybe I'll get lucky.

I have a big list of people/companies/games to research further, but if anyone has requests for others to try to track down just let me know and I'll make sure they're on there.

Jatoba's picture
Offline
Joined: 2018 Apr 16

When you do try to reach Tim Phillips & others, my recommendation is to start with David Riley. He's been tasked with porting Realmz to Carbon (& beyond) and fixing the game's bugs, all while keeping compatibility with Mac OS 9 & earlier as well as PPC and Intel Mac OS X. He is legally bound not to hand over that project to anyone or let the source leak, but I'm not sure who has him bound. Skip Meyer? Tim Phillips himself?

He's a very busy man, but if you reach out to him via Facebook, he should eventually reply. He's also a member of the Realmz Facebook group called "Realmz Castle".

Once every few months I message him about "Carbon Realmz", because he constantly asks that "we" remind him of it so it doesn't get forgotten. Likewise, I'm sure he wouldn't mind being approached for a Realmz-related interview.

Elyus's picture
Offline
Joined: 2009 Aug 9

Great talk! Thanks for sharing it, and I have really enjoyed your Life & Times of Video Games series.

I'm thrilled to hear about an upcoming second book since I liked the first one a lot. As a huge fan of Ambrosia, I'd welcome any more details about their history! I also wonder if you've thought about a section on Blizzard Entertainment? For a time, they seemed about the only major developer committed to their own in-house ports of Mac games, and I was always curious about their Mac division.

mossy_11's picture
Offline
Joined: 2009 Sep 5

Thanks Elyus!

I actually went back and forth on whether to have a section covering Blizzard's commitment to doing Mac versions of everything in the first book, but ended up deciding that there was no room to do it the way I wanted. So that's definitely something I'm keen to put into the second book.

I remember well the feeling in the late 90s that pretty much every commercial PC game company had abandoned us except for Blizzard and the few publishers that still allowed Mac porting houses to bring (a selection of) their games lineup over.

m68k's picture
Offline
Joined: 2016 Dec 30

Will you be doing a piece on the ill-fated PowerPC-centered cooperation between IBM and Apple? That was when "scientific" wisdom had convinced everyone but Intel & Microsoft that the days of CISC were numbered and RISC computing was the only way left to go.
I wonder what the tech world might look like today, if not for that earth-shaking blunder by those two powerhouses that used to dominate the IT landscape before WinTel came along and swept them all aside.

Jatoba's picture
Offline
Joined: 2018 Apr 16

That's... not how the story went, in a number of ways. Things to note:

- Intel got "threatened" by Microsoft during the 90s regarding supporting competing systems;

- Windows did ship on PowerPC, among other architectures, which is partially related to the previous point, depending on how things went. Microsoft never had a "CISC commitment" for the sake of CISC itself;

- RISC is inherently superior in most ways (IBM was the first company in the world to learn this the hard way, with the first RISC processor destroying the reputation of their CISC mainframes despite the competitor's much lower resources and expertise in that market), and part of the way Intel/AMD managed to compete was to make their chips more "RISC-like" internally (starting with Itanium), with shorter pipelines etc. (look how horribly the Pentium 4 sucks even at crazy high clock speeds). The truth isn't so black and white that we can say "RISC/CISC is better"; rather, it comes down to processor specifics (and it gets really specific the more we look into it). But the rule of thumb of using load/store designs and higher parallelism to increase performance is very true and very much applied.

- Motorola's main market was embedded-class processors (low power, but with more noticeable performance limitations) and IBM's was server-class processors (incredible performance, but very high power usage for mere "desktop" tasks). Among the three, Intel (and AMD) was the only one with a desktop-oriented offering (requirements-wise) during the early 2000s, while Apple wasn't IBM's/Motorola's no. 1 focus, sadly. This has nothing to do with CISC/RISC.

- PowerPC (and thus RISC) is still insanely powerful to the present day, verifiably even more so than Intel/AMD/ARM, with Google making a big purchase because of that not too long ago. Even for desktops/workstations, check out POWER9 and the Talos II.

- ARM is still on the rise, and it's a RISC processor family. Apple is supposedly even going to switch to it anytime now.

I don't think so much would be different if IBM/Motorola had taken over the desktop market. More than anything, it would be a question of which architecture Windows ran on that determined the most successful processor, because of a combination of price, accessibility and ease of use.

Nonetheless, a write-up on the whole subject would be very interesting. However, to do it really well, serious research would have to be done into processor specifics from the 1970s to the modern day, and I think that would be a bit too out-of-scope for a Mac-game-history-oriented book (and the story would still end with RISC, via the ARM switch).

Jatoba's picture
Offline
Joined: 2018 Apr 16

For the record, here's an interesting read that discusses how Intel processors switched to RISC internally to achieve higher performance, and how the CISC instructions are nothing but a shell over the real, RISC ones:

Why does Intel hide internal RISC core in their processors
(Didn't start with Itanium, though, that was a mistake on my part.)

So we already live in a purely RISC world, even more so than I thought, and it's really RISC that came along and "swept CISC aside", WinTel included.

m68k's picture
Offline
Joined: 2016 Dec 30

Sorry folks, but I have to disagree on this one. As a systems admin I also installed and maintained commercial servers running the RISC version of Microsoft's operating system.
From the very onset, MS treated that version like a bad afterthought. Patches and updates were delivered late, and even among the basic tools only a subset had been ported to RISC.
Microsoft only developed a RISC version of Windows because of existing commitments; they never made an effort to support that platform full throttle.

And about the RISC microkernel contained in Intel's CISC CPUs: that's nothing else but a hybrid-powered engine. It's *not* "pure RISC", as it utilizes that technology as an enhancement rather than a replacement of CISC.

But the basic point - the alliance of Intel and Microsoft versus a lonesome IBM, who had only a shaky deal with Apple to show for their PowerPC technology - is part of the historic record:
https://en.wikipedia.org/wiki/AIM_alliance

Jatoba's picture
Offline
Joined: 2018 Apr 16

When the CISC instructions are purely a front for RISC instructions, it is pure RISC. The CISC aspect of it is gone: what remains is a translation layer, kept for compatibility with older software that already uses it (and new software that will keep using it), so things stay universal across "x86" processors. The instruction set is CISC, but the processor is RISC, both in behavior and in the instructions actually executed.

SkyCapt's picture
Offline
Joined: 2017 Jan 11

Cisc cannot become Risc. The internet is full of shit when it keeps saying there's a Risc engine in Intel CPUs specifically: there's a Risc engine in all Cisc CPUs, Intel isn't special. Cisc instructions get broken down into a series of steps, each of which behaves internally like a Risc instruction, but as with all added complexity from an extra abstraction layer, Cisc takes a performance hit. Funny thing is, Risc is the most straightforward, natural way to do a CPU. Risc came first, and then Cisc was built on top of it. History makes it seem the other way around: chip makers jumped into Cisc right away (mainly due to the high cost and small capacity of RAM), and Risc was later offered up as something new and more high-tech, when Risc is faster but really more primitive and had been around all along.
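For readers following along, the decode step being described can be sketched in a few lines. Everything here is made up for illustration (the instruction names, the micro-op format, and the decoder itself); it is not any real ISA or any actual Intel front end, just the general idea of one complex memory-operand instruction expanding into simple load/operate/store steps:

```python
# Toy decoder: expand a CISC-style instruction with a memory operand
# into RISC-like micro-ops. Hypothetical syntax, not a real ISA.

def decode(instruction):
    """Break one CISC-style instruction into RISC-like micro-ops."""
    op, operands = instruction.split(None, 1)
    dst, src = [x.strip() for x in operands.split(",")]
    if op == "add" and dst.startswith("["):   # e.g. "add [counter], r1"
        addr = dst[1:-1]
        return [f"load tmp, {addr}",          # read memory into a temp
                f"add tmp, {src}",            # the actual ALU work
                f"store {addr}, tmp"]         # write the result back
    return [instruction]                      # simple ops pass through as-is

print(decode("add [counter], r1"))
# → ['load tmp, counter', 'add tmp, r1', 'store counter, tmp']
```

The point of the sketch is only that each step on the right-hand side touches either memory or the ALU, never both, which is the load/store discipline mentioned earlier in the thread.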

Jatoba's picture
Offline
Joined: 2018 Apr 16

Not all """CISC""" CPUs have RISC cores in them; older Intel, Motorola 68k and other (also generally old) processors are truly CISC (and worse).

The internet (and elsewhere beyond it) is indeed full of shit, but not in stating the fact that Intel's processors today are all RISC inside. That wasn't the case until PowerPC and other RISC processors (SPARC, MIPS etc.) started winning on performance. The CISC instruction set is translated in real time to RISC instructions; CISC is not directly executed, because the processor is precisely not CISC.

Also, what do you mean, "RISC came first"? RISC, at least as we know it, came later, and was a technological breakthrough when it did. The term "CISC" was supposedly invented retroactively to describe every other processor that wasn't RISC, both before the "first RISC" and after. (Not that the CISC/RISC timeline relates to their respective performance, though.)

SkyCapt's picture
Offline
Joined: 2017 Jan 11

I know what you mean. Take the 6502 (Apple //e): internally it had a lot of PLA (logic array), less CISC, but its design can be duplicated using cisc architecture. (Wasn't the model number 6502 derived from the approximate number of transistors, same as the initial 68000 chip?)

But yeah, risc came first. I've designed a cisc cpu at university, knowing what each transistor is for. Risc demands that every instruction complete in ONE cpu cycle. Since only a few instructions are even capable of completing in one cycle, the result is a cpu with fewer instructions than typical, hence the name risc = reduced instruction set. They had to walk on and step over the risc architecture in order to create the cisc architecture. In the 1970s the cpu industry leapfrogged over risc and made cisc, because given the characteristics of ram at the time, cisc was faster than risc. Risc is constantly streaming instructions through ram, whereas cisc is a computer inside a computer; cisc doesn't really have "instructions" at all, I call them tokens instead. The cisc cpu imports a token or two from ram and then "chews" on those tokens for up to many, many clock cycles, so slow ram pairs better with cisc. In the early 1990s the speed of ram improved enough for risc to defeat cisc.

The intel design cannot outperform PowerPC; cisc cannot beat risc. Take this example of the unavoidable performance hit: imagine cisc instruction "X" has as its final step "move ia,ib" (copies internal "a" register into internal "b" register), and instruction "Y" has as its first step "move ib,ia". Every time the cisc cpu executes instruction X followed by Y, there will be a wasteful cpu cycle in between. The risc cpu will instead use an intelligent COMPILER to screen out that nonsense.
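That redundant-copy example is exactly the kind of thing a compiler's peephole pass catches. A minimal sketch, using the hypothetical "move" syntax from the post above (this is an illustration of the idea, not a real compiler pass):

```python
# Toy peephole pass: drop a "move b,a" that immediately follows
# "move a,b", since after the first copy both registers are equal
# and the second copy is a no-op. Instruction names are made up.

def peephole(instructions):
    """Remove redundant back-to-back register copies."""
    out = []
    for ins in instructions:
        if out and ins.startswith("move "):
            src, dst = ins[5:].split(",")
            # if the previous instruction copied dst -> src,
            # this copy moves the same value back: skip it
            if out[-1] == f"move {dst},{src}":
                continue
        out.append(ins)
    return out

program = ["move ia,ib",   # final step of instruction "X"
           "move ib,ia",   # first step of instruction "Y" (redundant)
           "add ia,1"]
print(peephole(program))
# → ['move ia,ib', 'add ia,1']
```

A real optimizer does this kind of screening over a much richer instruction stream, but the principle is the same: the compiler, not the hardware, removes the wasted cycle.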

We shouldn't rip-off this thread, you want to start a new one, about adopting arm on desktop or other cpu design?

Jatoba's picture
Offline
Joined: 2018 Apr 16

That was interesting insight, I appreciate it. My understanding of processors doesn't go that deep at all, though, sadly, so I don't think I have much to add to that specific discussion. However:

We shouldn't rip-off this thread, you want to start a new one, about adopting arm on desktop or other cpu design?

Why not? That sounds interesting, even if I may not be able to contribute further to the discussion, because I'm very interested in listening to more about it. The thought of RAM speed impacting processor designs was a completely alien concept to me (which goes to show just how much I don't understand what actually goes on in the bare metal yet, as someone previously only used to "very high level abstraction" languages).

SkyCapt's picture
Offline
Joined: 2017 Jan 11

I meant we got off topic and need to put these msgs in another thread (of which I started new thread already).

Jatoba's picture
Offline
Joined: 2018 Apr 16

I forgot to address one point earlier:

But the basic point - the alliance of Intel and Microsoft versus a lonesome IBM, who had only a shaky deal with Apple to show for their PowerPC technology - is part of the historic record:
https://en.wikipedia.org/wiki/AIM_alliance

That link only discusses the launch of the AIM deal, plus the fact that Jobs and Motorola's CEO didn't hit it off (Jobs only returned about half a decade after the deal). Though it does correctly mention the alliance "yielded the launch of [...] Apple's highly successful Power Macintosh computer line", which can be described as anything but "shaky".

Thanks to that deal, unprecedented computing power was brought to desktop computers (before, it was available only in servers and high-end workstations). The peak was definitely the G4 and its inclusion of AltiVec, during 1999 and the early 2000s. AltiVec, which is a form of SIMD, completely obliterated Intel's equivalent of the time, called MMX, which pushed them to develop SSE instead, which every WinTel user today can be thankful for.
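For anyone unfamiliar with the term, the core SIMD idea behind AltiVec/MMX/SSE can be sketched in a couple of lines. This is a toy model, not real vector intrinsics: one "instruction" applies the same operation to every lane of a vector at once, instead of looping over scalars.

```python
# Toy SIMD model: a 4-lane element-wise add performed as one logical
# operation. (An AltiVec register is 128 bits, e.g. four 32-bit lanes;
# this sketch just models the lanes as a Python list.)

def simd_add(a, b):
    """Add two 4-lane vectors element-wise, as one logical operation."""
    assert len(a) == len(b) == 4
    return [x + y for x, y in zip(a, b)]

print(simd_add([1, 2, 3, 4], [10, 20, 30, 40]))
# → [11, 22, 33, 44]
```

On real hardware, the win is that all four additions happen in a single instruction, which is why media and signal-processing code benefited so dramatically.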

From renowned assembly developers behind Fantasm, here is what they had to say back when AltiVec was new:

Altivec versus MMX?

We mention MMX(tm) for it has (allegedly) similar capabilities. It turns out that Altivec is superior on several counts, but this isn't (or shouldn't be) just because of speed. If it is not included on every G4, then the usage of Altivec will be limited to niche markets. If however, it will be available on every G4 Mac, then the potential applications are limitless.

MMX(tm) although better than nothing has several current problems: it is built on a legacy platform, is a limited implementation, provides an extremely complex problem for compilers to use these extensions effectively. It kills FPU registers, and prohibits dual use of FPU unit along with MMX instructions.

Altivec however is built to a full specification; indeed, it is a superset of MMX. It allows all execution units to dispatch and be used simultaneously assuming a well scheduled instruction stream. It provides 32 new registers and many more instructions than MMX's basic 57. It also provides some very specific "media processing" and digital signal processing instructions that MMX lacks.

Conclusion: MMX does not compare well to AltiVec.

The only part of the whole deal that was shaky was the G5. As magnificent as the processors technically were (and still are), they were just not fit for Apple's desktop and especially laptop performance-per-watt demands. Its AltiVec implementation was also rushed (slapped onto a POWER4) and nowhere near as masterful as the G4's, meaning that at the same clock speed etc. it had much worse AltiVec than the G4. It still delivered faster AltiVec results, but precisely because of higher clock speed, L2 cache etc., not due to processor design merit.

At that point, it was obvious Apple had best switch to a vendor that still moved desktop processor technology forward, rather than server (IBM) or embedded-systems (Motorola AKA Freescale AKA NXP) technology, as neither was committed enough to Apple anymore. Funnily enough, that was when Microsoft switched to PowerPC, with their highly successful Xbox 360 console (with a tri-core 3.2GHz G5-derived processor called Xenon).

Sooo... point is, this whole "IBM deal sucked", or "CISC rulez" or "WinTel swept all aside" or "Microsoft stuck with CISC/Intel" is not how it went. It was key to Apple's success, and served them very well during its time, until the Intel switch, and now we have an ARM switch on the way (both Apple and, possibly, even Microsoft with Windows). And who knows what will come after that?

SkyCapt's picture
Offline
Joined: 2017 Jan 11

MMX was what, 1995? AltiVec had four years to learn from their mistakes. Intel even hated the name: Matrix Math eXtension (extended instruction set) was too easy, we hate easy, so they wanted to say it stood for Multi Media. Then they killed off that name and began using "SSE", streaming SIMD; that's better, comfortably numb. Wait, "SSE2"? 3?

I saw it coming. I wrote an algorithm which performed matrix math inside a loop (it's one way of doing 3D depth perception). During cross multiplication, several terms kept cancelling out and other terms were always zero, so I knew the algorithm could be streamlined using dedicated hardware. This was the birth of what we call the GPU. Fast forward, and I've gotten my Power Mac to use its Altivec as a substitute for the gpu in the 3D game Ratatouille. It plays at an OK fps, and a thermometer probe on the gpu reveals the gpu is offline during gameplay.

Jatoba's picture
Offline
Joined: 2018 Apr 16

Just a side note (and I'm sure you know this very well):

This was the birth of what we call the GPU.

Separate, dedicated graphics processing units (GPUs) were featured in all sorts of computers & game consoles well before both MMX and AltiVec appeared, as early as the late 80s, if not earlier.

About MMX & AltiVec, the point is that Intel was pressured to improve, not that AltiVec coming out at the top was a surprise (it wasn't). It's the usual "free market" thing of competition leading to improvements. Like AMD vs. Intel, or all processor manufacturers against each other, or OS vendors, or any other technology or product.

SkyCapt's picture
Offline
Joined: 2017 Jan 11

Those 1980s graphics coprocessors were 2D; those machines barely had the oomph to do low-res wireframe 3D. So, unlike how I throw around the terms risc & cisc for dates before those terms were in use, I didn't mean "gpu" in that loose manner. I meant "standardized 3D GPU", sorry for any confusion.

m68k's picture
Offline
Joined: 2016 Dec 30

MMX was halted by Microsoft, who warned Intel that if they enabled OS-independent hardware acceleration on their platform for Microsoft's competitors, then Microsoft would open itself up to Intel's competitors. So this entire flirting with the PowerPC was indeed just that: Microsoft making a point towards Intel and doing the minimum legally required to fulfill old commitments from their days of cooperation with IBM.

Microsoft *never* intended to support the Power platform full throttle, and why should they?
IBM had planned the move to the PowerPC precisely to rid themselves of Microsoft (=> PS/2). By this stage John Sculley had taken over from Steve Jobs, and that dude wanted to sell Apple out to Big Blue almost from day one.

And the Xbox didn't hit the market before 2002; by that time Microsoft had won the OpSys war and could afford to pick and choose.
I am not saying that RISC CPUs don't have their advantages, but IBM & Apple gave up on CISC technology way too soon. Intel adding a RISC kernel to their CISC CPUs doesn't change the fact that their design won the competition.

I just don't believe that WinTel's monopoly made the computing world more colorful. Rather, it left us stuck with a dinosaur approach that is only now being challenged by ARM's low-power CPUs.
And yes, I dream of the day when the 68880 will come to the rescue of us all, with tile free Platinum 8.99 Wink

BryMD's picture
Offline
Joined: 2018 Jul 2

Hi m68k. Sometimes, just sometimes, it's better to own up to errors in rationale rather than continue beating a dead horse. RISC never went anywhere. RISC/PowerPC has been alive and well, powering most generations of both Sony and Nintendo gaming hardware (as well as the Xbox 360) since the early 2000s. And RISC/ARM is now powering almost everything in your home with a battery. If anything, it seems that CISC is now on the losing end, if there ever was a point to this comparison to begin with...

BryMD's picture
Offline
Joined: 2018 Jul 2

And a small Microsoft tidbit on the side: early development kits for Microsoft's Xbox 360 were standard off-the-shelf Apple Power Mac G5s Smile

Jatoba's picture
Offline
Joined: 2018 Apr 16

There's a video or keynote of some sort that I remember seeing with Steve Jobs declaring how Microsoft had become their "no. 1 Power Mac G5 customer" or something along those lines, which was pretty amusing. Smile IIRC it was thousands of them or so, as Xbox 360 dev kits.

SkyCapt's picture
Offline
Joined: 2017 Jan 11

I recall 1997 when Bill Gates's Big Head was projected on the giant screen. People booed, then some giggling once they realized he was live and listening to them booing.

Jatoba's picture
Offline
Joined: 2018 Apr 16

@m68k
The first half of your comment is overall in accordance with what I said from the beginning (Microsoft keeping Intel in check). That detracts nothing from the previous points, and I brought it up first myself to address one of your own earlier statements.

But the second half... Really, what was that? I really don't know what you are going on about there:

- In the sense you speak of, the OpSys (Operating System) war had long since ended, back during Apple's 68k Mac phase, way before PowerPC existed. Early '90s Windows completely ruled all over;

- Xbox 360 was from 2005-2006, not 2002. The original 2002 Xbox was an Intel-based console. (The Xbox 360 also vastly outsold it, btw);

- "by that time Microsoft had won the OpSys war and could afford to pick and choose".
I have no idea what sense this was supposed to make. Console makers may factor in a lot when picking a CPU, but "winning/losing" an unrelated operating system "war" is surely not one of them. (And in 2001-2002, they still picked an Intel CPU for their console. Going by your reasoning, that would mean the "OpSys war" wasn't won yet.)

- "but IBM & Apple gave up on the CISC technology way too soon."
I humbly ask: How so? How does that make any sense whatsoever? How would history have gone better for Apple had they stuck with 68k or, as you religiously insist, any other CISC processor? Why CISC specifically?

- "Intel adding a RISC kernel to their CISC CPUs doesn't change the fact that their design won the competition."
It does. Not that the OS competition had anything to do with RISC or CISC specifically in the first place, as noted multiple times above, but the fact remains that Intel CPUs switched to RISC during the 90s, which is present in their processors to this day, and thus it is factually inaccurate to refer to them as CISC or to say Intel's design won, because its CISC design is gone. It was gone way before ARM took over everything. Had it remained CISC, it would have been unable to keep up. But this point has also been explained multiple times already, so I won't reiterate it further.

- "is only now being challenged by ARM's low power CPUs." (Emphasis mine.)
That, again, is simply factually false. I politely ask you to refer to all the previous points for that.

Look, I don't mind if you just absolutely adore 68k and all it entails, CISC included; that's not a problem. But misinformation is, especially when suggested for inclusion in a book. We all are mistaken about various things throughout life. That's fine and, to a degree, unavoidable. I have been mistaken about PowerPC, Apple, Macs and much else over the years, and although I always loved all of these, facts are facts, even when I did not like them. But I think it's important to slowly accept them when a lot of proper evidence is provided. Before you accept something, you can (and should!) doubt it and verify it, but once confirmed, it's confirmed. It's still good to be skeptical even about things we "confirm", but only provided you have the required facts/evidence to back it up, not personal beliefs.

I say all of this with amicable intentions, at the risk of sounding arrogant, which I'm not trying to be. But I felt getting this across was more important than that risk. I genuinely mean well.

BryMD's picture
Offline
Joined: 2018 Jul 2

We all are mistaken about various things throughout life

Man, you're now touching on what scares me the most in this life: illusory truths! And, secondary to illusory truths, Google, which scares me to DEATH with its unchallenged capacity to manipulate the 'truth' to its own benefit, whether monetary, social or political.

Everyone knows somebody they think is easily manipulated, but no one is willing to accept that they are just as easily manipulated themselves, simply because that would show weakness, and this is exactly what Google thrives on! We've basically voluntarily handed the keys to our own FREE WILL to a single company (well, multiple companies, counting Facebook and the likes).

*physically shivers*

PS: I know this is off point, but what was this thread about again... Oh, yeah: Ambrosia Software! Tongue

Jatoba's picture
Offline
Joined: 2018 Apr 16

It is about a cool book and a sequel to it. Tongue This talk is techniiiiically not off-topic, because all along we were discussing a book suggestion.

Honestly, the illusory truth deal is quite a thing. (I will just try to cross over your post to bring it back on-topic. Tongue) Even in all I said, it's not like I'm any processor specialist (far from it), so although I believe I'm not technically mistaken, and I did research the subject as best I could over the years, I could very well be wrong and it just hasn't been properly pointed out yet. Happens all the time.

One "cool" major illusory truth is the perception today that Xerox invented the GUI, when, guess what, they also stole the idea (and IIRC Apple had a deal going with Xerox, while all the others proceeded to copy Apple, like a SCSI daisy chain), but people parrot that so much that it's almost impossible to get through to them.

Now that could almost be part of the book! Except that, too, wouldn't encompass much Mac gaming (other than the history of the GUI that some Mac games used).

mossy_11's picture
Offline
Joined: 2009 Sep 5

Well, "stole" perhaps isn't a fair choice of words — given that some of the key people in Xerox PARC's Alto team had worked with Doug Engelbart in the 1960s. But Engelbart's work is definitely not as well-known today as it should be.

And yeah, sadly it's almost entirely beyond the scope of my Mac gaming history books — except, as you say, in footnotes or asides about a few GUI-related things. As is most of the processor talk above.

Jatoba's picture
Offline
Joined: 2018 Apr 16

Yeah, my bad, I forgot to add quotation marks for "stole", I didn't mean it literally. It's just that it's a popular word choice among people that discuss the whole GUI drama. Tongue

OneEightHundred's picture
Offline
Joined: 2020 May 11

Meh, sorry for the necro, but one thing I keep thinking about with Ambrosia not making it, which I didn't see mentioned anywhere in this thread, is that somewhere around '96-'97 they were working on their own first-person adventure/shooter game, Manse: http://docmaker.whpress.com/files/the-ambrosia-times-3-4/page10.html

That fell through, and IIRC they were going to publish some 3D fighter jet game that also fell through, so they went from having some of the best-produced games on Mac to completely missing the real-time 3D train except for Avara, which I think they've said was either their worst-selling or second-worst-selling game (with the other being Chiral).

Kinda wonder if they could have wound up in a situation more like Pangea, producing more modern games, but that probably wouldn't have worked out past 2006 when Apple stopped financially supporting Mac-exclusive shops via bundle deals, making it much harder for them to compete with ports.

Another sad "what could have been" contrast is Beenox, who published their first game with Ambrosia, and wound up turning into an Activision subsidiary and working on Call of Duty games.

mossy_11's picture
Offline
Joined: 2009 Sep 5

I didn't know that Beenox did the Modern Warfare remasters. Good for them still going strong 20 years on. Looks like the key people from Pillars of Garendall are all gone, though, and have been for a while.

I'm hoping to talk to everyone who worked at or with Ambrosia in the coming months to fill in all these gaps about what else was happening behind the scenes and what the stories were behind all the games (including the cancelled ones).

I guess the key difference with Pangea is Brian Greenstone. He was (and still is) a highly skilled 3D programmer, and he had close connections to Apple, and these things gave Pangea a distinct advantage over Ambrosia — who relied on external contractors and other studios to develop games and so were limited by what deals they could make and how much money they could afford to spend on wages.

The funny thing with Ambrosia and 3D games is that one of their final non-port projects was actually the game that birthed the Unity Engine: Over The Edge Entertainment's GooBall. It sold poorly, so the creators decided to just focus on improving their engine and sell that as a dev tool instead. It'd probably be a contender for best pivot ever.

adespoton's picture
Offline
Joined: 2015 Feb 15

Mossy, please tell me you've captured the story behind the birth and demise of Avara. That game was my favourite game of all time, partly due to its hackability and its novel use of ClarisWorks documents. It's a game whose demise was largely due to its strongest assets; I always wished Juri had just open-sourced the entire thing... but I guess he couldn't, as he was using the components in other profitable ventures as well, and had sunk years into developing it.

[edit] And yeah, eventually he DID open source it. Now we've got https://github.com/avaraline/Avara and I have no time to play it :\

mossy_11's picture
Offline
Joined: 2009 Sep 5

Not yet. I remember contacting Juri when I was working on the first book, but for some reason we never actually did an interview. He's on my list of people to talk to soon. I may also talk to a few key community members about the modding scene that formed around it, too.

adespoton's picture
Offline
Joined: 2015 Feb 15

I just compiled and ran Avara last night... doing double and rocket jumps seems way harder than it was in the 90s, but the game certainly feels the same.

Next trick is to see if I can get it running on older hardware too! The only external dependencies are SDL2 and SDL2_net (both frameworks need to be downloaded and installed from the SDL website). Hopefully I can get it running on 10.4 PPC, since they've stripped out all the original assembly code.

OneEightHundred's picture
Offline
Joined: 2020 May 11

Hmm, I thought more of their games were made in-house. Looking at the credits now, I guess only their first three were?

They were still doing a lot of the in-game art, though, which is more than the typical publisher relationship. But the implications of that were also different: in the early '90s, high-quality art, especially on a platform with high-res, high-color displays, was the most important factor in a game's visual quality, so it was something very valuable that they could provide even as a "publisher."

By the late '90s, though, that display advantage had evaporated, and visual quality depended much more on programmers wrangling 3D tech.

mossy_11's picture
Offline
Joined: 2009 Sep 5

I think Ferazel's Wand was technically done in-house, since Ben Spees was on staff at Ambrosia at the time. Otherwise, yeah, (if memory serves me correctly) Andrew Welch programmed their first two games (Maelstrom and Chiral) as well as their fourth (Apeiron), then for everything else the main developer was external and they just helped out with various bits and pieces internally as and when needed (be it with art, design, production, or programming assistance).

Xenon4u2c's picture
Offline
Joined: 2020 Jun 5

“The best things in life are free for 30 days.” ... LMAO !!!

nil0bject's picture
Offline
Joined: 2012 Nov 14

I have a Bungie playlist on YouTube you might be interested in:
https://www.youtube.com/playlist?list=PLaieH8Mjr5VyCpSiY_1fok62MXJve8zxh

Where is PAX Aus held?

mossy_11's picture
Offline
Joined: 2009 Sep 5

In Melbourne, Australia.

Nice to see some of those old Bungie videos collected together in the same place for easy watching.