
I built a ChatGPT client for Mac OS 9

greystash

Well-known member
I only have good things to say about REALbasic; building this was very straightforward. The only trouble was finding what remnants of the documentation still exist. I even read a physical book!
 

greystash

Well-known member
Update: LegacyAI 68k v1.0 is now available here. It is only compatible with 68020 CPUs and above. Do not run it on older CPUs or your system will crash. There have also been some improvements including the following:
  • Improved language support
  • Expandable window
  • Better network status visibility
These features will make it to the other builds when I have time.

If you come across any bugs please post them here.

Enjoy!
 

3lectr1cPPC

Well-known member
I’ve gotta try this out on my 3400c soon! Would love to give the 68k version a go but I’ve been stumbling through getting the Ethernet bridge function of my PiSCSI to work so none of those systems are online yet… may have to create a thread on that soon.
 

Nixontheknight

Well-known member
The 68k version works well on my LC III. Now to get some parts for a disk-on-module for my 6200 so I can test the PPC version on it.
 

Nixontheknight

Well-known member
I've just released v1.3, which adds support for GPT-3.5. This is one of the best AI models, so responses should be a lot better now!
Next update I will focus on bug fixes and looking into GPT-4.
Finally got my 6214 working. LegacyAI works, but longer responses, like asking it to write articles, seem to make it time out.
 

CC_333

Well-known member
Even though I'm an AI skeptic, I have to say that this is a pretty neat idea!

That said, given that it timed out for @Nixontheknight on longer responses, maybe it would be worthwhile to make the timeout user-adjustable, since others requesting longer responses might hit the same limit?

Either that, or just make the timeouts longer...

c
 

greystash

Well-known member
I've just released version 2.0 which includes many new features and bug fixes.

@Nixontheknight @CC_333 OpenAI's recent changes have resulted in slower API response times, and Heroku (where I host the proxy server) has a max timeout of 30s.
Unfortunately I can't change this, and I'm unsure whether moving to another platform will be worth the added cost to me. I'll add some retry logic with the next update, but if it continues to be a problem I'll look into it again.
 