tines 5 days ago

I love this project.

I've been feeling lately that as computers have become more advanced and software has become more inscrutable, our relationship with our computers has changed, for the worse. This essay hit home for me: https://explaining.software/archive/transparent-like-frosted...

These old-school computers viewed their users as creators, as developers. Modern computers (read: smartphones) _are_ the users, and the "used" are just ad-watching revenue cows. I passionately hate this arrangement.

When I have children, I want them to know what computing should feel like---empowering, creative and stimulating, not controlled, consumptive, compulsive and mindless. I want to give them a computer that builds up their spirit, rather than grinding it down.

I think this computer should have several qualities:

0. The computer should be about _creation_ and not consumption.

1. The computer should be _local_, not global. Intranet should be prioritized over Internet.

1.5. As a corollary, the computer should be _personal_. It should encourage and reward in-person interaction, physical sharing of information and programs, and short-range connection between computers.

2. The computer should be _limited_. Because the medium is the message, we have to restrict the capabilities of our media to better promote the messages we value.

2.5. As a corollary, the computer should be _text-oriented_. Graphics shouldn't be impossible, but text should be primary. The computer should cultivate a typographic mind, not a graphic mind (in Marshall McLuhan's terminology).

3. The computer should be _focused_. It should never distract you from what you want to work on.

4. The computer should be _reactive_, not proactive. It should never give you a notification. You should be in charge of retrieving all information yourself, like a library, not a call center.

5. The computer should be _physical_. It should be oriented around physical media.

6. The computer should be _modifiable_. It should encourage and reward inspection into its internals, and be easy to change.

7. The computer should be _simple_, understandable by a single person in its entirety with time and study.

The Mega65 is amazing and checks these boxes, but unfortunately it's a tad expensive for me. What other machines are out there like this?

  • hajile 5 days ago

    Mega65 is a nostalgia project aimed at targeting a specific kind of older computer system.

    If practical and simple were the goals, it wouldn't use an 8-bit chip, nor would it focus on things like BASIC, as those choices make things harder rather than easier.

    Mega65 is about working within constraints, but (to me) the bend of the complexity curve, where things are simple enough to understand yet powerful enough to do necessary tasks, requires much larger software and hardware resources than what a system like the Mega65 offers.

    At a bare minimum, I believe you'd want a much more powerful single-core machine with the ability to do floating point and vector calculations, with access to at least a couple hundred megabytes of RAM and a few gigabytes of storage. Something more along the lines of a Pi Zero, but based on open hardware and open chip designs, seems to be around the point where it is powerful enough to do all the common, non-connected tasks a user might need while still being simple enough that you could understand most of the moving parts if you wanted to take the time.

    • Mr_Minderbinder 4 days ago

      >If practical and simple were the goals, it wouldn't be using an 8-bit chip nor would it be focused on things like BASIC as these kinds of things make things harder rather than easier.

      A 32-bit word size is the absolute minimum that I would consider “practical”. With that you can comfortably implement reasonable floating point and integer arithmetic that can solve most of the common practical problems you encounter in science, business or engineering. Still, people criticised the System/360 because it was 32-bit, saying it wasn’t enough compared to the 36-bit word size computers that preceded it. Mechanical calculators generally had ten decimal digits of precision, so the thinking was that digital computers needed to at least match or exceed that.

      • kbolino 4 days ago

        Plenty of older microcomputers got by just fine with only 8 bits in a word. That's enough room to fit two binary-coded decimal digits and thus perform all the common arithmetic operations one or two digits at a time. Many quite good handheld calculators and early personal computers were able to handle practical computations very well in this way, albeit at speeds far below the billions of operations per second expected of modern computers.
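
        To make the mechanism concrete, here's a minimal C sketch of that technique: packed-BCD addition done one byte (two decimal digits) at a time, in the spirit of the 6502's decimal mode. The function and data layout are illustrative, not any particular machine's firmware:

          #include <stdio.h>
          #include <stdint.h>

          /* Packed BCD: two decimal digits per byte, least significant byte
             first. The loop adds one byte (two digits) per step, carrying
             between digits -- roughly what 8-bit micros did in decimal mode. */
          static void bcd_add(const uint8_t *a, const uint8_t *b,
                              uint8_t *sum, int nbytes) {
              int carry = 0;
              for (int i = 0; i < nbytes; i++) {
                  int lo = (a[i] & 0x0F) + (b[i] & 0x0F) + carry;
                  if ((carry = lo > 9)) lo -= 10;
                  int hi = (a[i] >> 4) + (b[i] >> 4) + carry;
                  if ((carry = hi > 9)) hi -= 10;
                  sum[i] = (uint8_t)(hi << 4 | lo);
              }
          }

          int main(void) {
              uint8_t a[2] = {0x95, 0x12};      /* 1295 */
              uint8_t b[2] = {0x07, 0x08};      /* 0807 */
              uint8_t s[2];
              bcd_add(a, b, s, 2);
              printf("%02X%02X\n", s[1], s[0]); /* 2102 = 1295 + 807 */
              return 0;
          }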

        The advantage of 32-bit word size is more for interoperability with modern systems and familiarity to modern programmers and programming languages.

        • Mr_Minderbinder 3 days ago

          No one in their right mind would design a general purpose computer system from scratch with an 8-bit word size unless they were forced to. Neither Zuse nor Eckert and Mauchly ever did.

          > Plenty of older microcomputers got by just fine with only 8 bits in a word.

          They got by fine on your kid’s desk. Those processors were better suited for implementing smart terminals that people would use to interface with the computer that would actually solve your problem.

          > That's enough room to fit two binary-coded decimal digits and thus perform all the common arithmetic operations one or two digits at a time.

          Why perform them one digit at a time when you can perform them 10 digits at a time? Why force your users to adopt a programming model where you have to implement arbitrary precision arithmetic anytime you want to use a number larger than 255? Or have to implement a useful floating point in software? Or be forced to pull in libraries or use some environment where that stuff is pre-implemented in order to do basic operations that previously could be done in a single line of assembly?

          >...albeit at speeds far below the billions of operations per second expected of modern computers.

          They performed at speeds below what was expected of computers from the 50s and 60s.

          > The advantage of 32-bit word size is more for interoperability with modern systems and familiarity to modern programmers and programming languages.

          No, it is not about modern computing or modern programmers. 32-bit or similar word sizes have been the standard for electronic computing since the earliest times because there are intrinsic advantages. Nobody did 8-bit computing before the microprocessors of the 70s, because it would have been stupid to limit yourself that way if you didn’t need to.

          • kbolino 3 days ago

            But the designers of early micros did need to limit themselves, and they proved that those limits were not insurmountable. The luxuries afforded by kilowatts of electricity and cubic meters of space were beyond the reach of early micros. You may sneer at the fruits of their labors, but for a large number of people across an entire generation, these devices were accessible to them when room-sized computers weren't.

            Taking such a hostile tone over what is obviously already settled seems strange to me. Of course you don't design modern systems with 8-bit words. There are many advantages to larger words besides those I mentioned or implied. But calling the elements which were good enough for the second wave of the computer revolution completely impractical is also absurd to me.

            • Mr_Minderbinder 2 days ago

              > Of course you don't design modern systems with 8-bit words.

              Again this is not about contemporary computing or computer technology level in general, as I have already made clear. What is good or bad design for general computing is independent of those things, it is timeless. Why do you keep referring to some arbitrary “modern computing” as if that is somehow relevant when word-size in particular is a design decision that is not really dictated by technology level? It is often possible to compromise on other aspects of the design in order to have the width you want.

              What you mean to say is “you don't design any general purpose system with 8-bit words,” but you won’t, because it would undermine your whole argument.

              > But the designers of early micros did need to limit themselves,

              It would have been more difficult but not impossible to create an IC with a larger word-size in the early 70s. The 8-bit word size was not entirely chosen for technical reasons but also because they were targeting a specific market, one that did not have to meet all the demands of general computing and one for which the general computers were ill suited, so this made the most sense. Would you really not say they were designed for and are better suited for the subset of computing applications that is called “embedded” computing? Don’t forget that most of them were designed before the desktop computer market existed and were never intended to be used that way. You can force them to be more than what they are (and people did) and pretend that they are, but it doesn’t mean you should.

              > But calling the elements which were good enough for the second wave of the computer revolution completely impractical is also absurd to me.

              Design concessions which are intrinsically bad, awkward or inappropriate do not become good because they “still worked” or succeeded in a particular market. Not that it counts for much, since those computers were not marketed in the same way or purchased for the same reasons. You say it was “good enough”, but a lot of people didn’t think so, though I guess most of them are dead now (like Wirth). The word-size was not a selling point; it was merely not enough of a hindrance to outweigh the novelty of the overall product. What it takes to introduce people to electronic computing is not the measure of general computing.

              What is absurd is people like yourself insisting those particular machines, out of all machines, should be used to inform us as to what is reasonable for general computing when, considering the context of their creation, it is completely unwarranted.

              • kbolino 2 days ago

                Despite the disdain of ivory-tower academics who had access to resources well beyond their individual means, practical computing for the masses was only made possible with the "wrong" parts and the "wrong" number of bits and all these other "mistakes". Even today, early microcomputers (and their remakes) remain more broadly accessible than contemporaneous minicomputers.

                We're now entering an era where, despite sophisticated hardware with wide words and layered caches and security rings and SIMD units and all this other fanciness, the end-user increasingly doesn't own their computing experience. While 8-bit and 16-bit computing may not meet your arbitrary definition of sufficiency, it still provides an avenue for people to access, understand, and control their computing. If it wasn't good enough for Wirth, so what?

                • Mr_Minderbinder a day ago

                  As I mentioned in my very first post it is based on what humans want out of computing. We have ALWAYS wanted to do arithmetic with large numbers and with fractional values. We have wanted to compute trigonometric functions since antiquity. We wanted human computers to do all these things and more in the past so why shouldn’t we want the same of automatic computers now?

                  What we want out of computing is constantly evolving but in most cases the new things boil down to solving the same old set of basic problems. A large portion of modern cryptography is simply arithmetic with large numbers for instance.

                  An 8-bit machine can be used to solve these problems but often inefficiently or awkwardly, which is not the point of automatic computing, hence the impracticality of that word size in a general purpose computer.

                  Since word-size mostly determines the magnitude and precision with which we may conveniently represent a quantity it is pertinent that we consider what scales we might be expected to work at. There are 13 orders of magnitude (~43 binary orders) between a pauper with a single dollar and the nominal GDP of the USA. It is therefore unlikely your adding machine will need more than that many columns. There are 61 orders of magnitude (~203 binary orders) between the diameter of the observable universe and a Planck length. Those are the opposite ends of two extremes and we could reason that common calculations would involve a more Earthly scale. We should also consider the other natural quantities but the point is that in this world we must expect to operate at scales which have been determined by economic or natural forces. A machine that doesn’t acknowledge this in its design is not practical. Continually applying one’s reason in this way we can arrive at a sensible range of scales with good generality.

                  These are the principles which should guide us when determining a practical word length. Like considerations have guided the design of all our calculating tools. What good would an abacus be if it had so few rods as to make common mercantile calculations more awkward? Perhaps it seems less arbitrary now.

                  Many aspects of the universe appear continuous so it is natural we would need to do calculations involving numbers with fractional parts (rationals are a useful abstraction for many things besides). We could use fixed-point arithmetic but we have found that floating point is generally more efficient for most applications. A general purpose computer should have a floating point unit. Even Zuse’s Versuchsmodell 1 had one in 1938. A 16-bit float is useful in some applications, but a 32-bit or higher floating point is what scientists and engineers have always asked for in their computers.

                  If you want to talk about market success the fact that computers across many different eras, technologies and sizes with 32/36 to 60/64 bit word sizes have existed and continue to exist despite our evolving computing needs, suggests some level of generality which a smaller word size does not provide. The short career of 8-bit words in (semi) general purpose computing counts against it more than anything. It is even being displaced in its traditional avenues.

                  > the end-user increasingly doesn't own their computing experience. While 8-bit and 16-bit computing may not meet your arbitrary definition of sufficiency, it still provides an avenue for people to access, understand, and control their computing.

                  What does this have to do with word-size? Why even bother mentioning this? You seem to endlessly conflate larger word-sizes with “modern computing” and smaller word-sizes with specific past implementations or genres of computers. Don’t bother arguing the merits of a smaller word-size by appealing to unrelated virtues or ills.

                  By the way, Wirth was not against the idea of small computers, he just thought it was being done badly, so he designed his own. People have always thought that computing could be done better; this shouldn’t be a surprise to anyone here.

                  > If it wasn't good enough for Wirth, so what?

                  It means that there are people with a far better sense for what is acceptable and unacceptable in computing than you.

    • nine_k 4 days ago

      Actually, something comparable to the original Symbolics Lisp Machines could be at the sweet spot where you still understand the whole system from the hardware up, but have powerful enough tools to be actually productive.

    • tines 5 days ago

      I agree completely. I've looked at designing a system around the Pi Zero, but it's so much work for someone with the time and skills that I (don't) have. And the Pi Zero doesn't seem to have the kind of I/O capability that I'm looking for.

      • thijson 4 days ago

        I feel like the RP2040 or RP2350 approach the simplicity of these older machines.

        https://www.raspberrypi.com/products/rp2350/

        The documentation is pretty good. Reading it reminds me of going through the documentation provided with my Tandy Color Computer as a kid.

        • 082349872349872 4 days ago

          I appreciate that when BCPL got its first floating point support, almost half a century after initial release, it was only because Martin Richards was writing a flight simulator for his Pi.

  • jonjacky 4 days ago

    The Tulip Creative Computer[1][2] hits most of your points (I'm just a customer). It is definitely not a retro computer. It uses modern technology (ESP32S3 microcontroller with megabytes of flash memory and RAM, USB ports, wifi, color touch screen etc.) and runs a modern programming language (MicroPython) that also serves as the operating system.

    This particular product might not be exactly what you want, but it shows that you can use these technologies to build a computer that is much simpler and more malleable than a modern PC in both hardware and software, but is still very capable, and intriguing to use.

    1. https://github.com/shorepine/tulipcc

    2. https://news.ycombinator.com/item?id=41122986

    • scandox 4 days ago

      Weirdly, that HN article you linked to is flagged.

  • wvenable 5 days ago

    I grew up before the Internet and we still craved connectivity with our computers. I remember dialing into BBSes and playing turn-based text games and it was amazing. It was also the best way to get software; my computer would be pretty boring if the only software I had was what I created myself or purchased in a box.

    I also could have done so much more with all my computers, from my Commodore 64 to my 286, if I had had the vast information resources that are available now.

    • tines 5 days ago

      I think the difference is that in the days of the nascent internet, connecting with people meant much more than it does now. You dial into a BBS or log into a MUD and you have a small-ish community of real people that you can develop relationships with. Modern internet connectivity almost means the opposite: all the major services are oriented toward moneymaking, nothing is genuine, there is no sincerity, most behavior is motivated by accumulation of worthless social capital.

      So, the society that you craved connection with no longer exists now that you are able to connect. This is another thing that, seemingly, has to be rebuilt from the ground up locally.

      • icedchai 5 days ago

        I got started with a 1200 baud modem, back in the late '80s. I miss the local community found on BBSes and the early, text-oriented Internet providers. There seems to be no replacement for that at all. Any "local"-oriented subreddit, Discord, etc. is full of bots and spammers.

  • andai 5 days ago

    I've wanted an e-paper laptop ever since I saw a Kindle ad in 2008. I'm also interested in ultra low power computing (solar charging, daylight readable, months of battery life, offline-first, mostly text...). So your list has a lot of overlap with mine!

    Such a thing doesn't seem to have been invented yet. The reMarkable might come close (or that weird typewriter-like thing?) but I haven't been able to justify any of those purchases yet...

    I'm not 100% sure about e-paper (the lag may actually be a feature reducing addictiveness), I'm also amenable to those transflective Sharp LCDs! (Though I think they're a bit too small for a daily driver.)

  • Findecanor 3 days ago

    I think a computer system checking most of those boxes could be made with software on modern PC hardware. I think that is possible as long as we keep ourselves in check and continue following the principles.

    Your list of qualities reminds me of what I liked most about the Amiga and its OS. The (desktop) system was manageable by one person, and you learned as you went. It did not "mollycoddle" the user, and it did not require you to spend years of study (as with Unix, Linux and the internals of MS-Windows).

    One older principle (1967) that is related to several of yours, and apparently largely forgotten these days, is "the principle of least astonishment" [0].

    Other lists of computing principles I've come across that I find interesting (even if I do not always agree): Loper-OS's seven laws of Sane Computing [1], and the principles of Permacomputing [2].

    [0]: <https://en.wikipedia.org/wiki/Principle_of_least_astonishmen...>

    [1]: http://www.loper-os.org/?p=284

    [2]: https://permacomputing.net/Principles/

  • card_zero 4 days ago

    I'm struck by how badly we wanted the opposite of everything on your list, back when a C64 was what a kid was typically stuck with. Well, maybe not the opposite of point 6; there's nothing fun about a locked-down machine. But we very much wanted graphics, and sound, and 3D and simulation and virtual reality and MMORPGs (and internet, or at least improved telnet). And we wanted to be wildly distracted by a machine that does magic tricks and spins fantasies. OK, we wanted to be creative too, but as an afterthought.

    Possibly the joy of the golden age was in ruining computing for everybody who came after.

    • skydhash 4 days ago

      I started using computers with Windows XP and Windows 7, and the computing experience was way saner than what we have now (though that may be due to the fact that everything was local in my town, as internet access was scarce). It was a creation machine as well as a consumption one, mostly because a 24/7 internet connection was not expected and software had to be file-interoperable instead of siloing data in databases. No notifications other than system ones. And an actual desktop design for the UI.

  • DowagerDave 5 days ago

    Interesting that the hackers in the book of the same name (by Steven Levy) were trying to change the public's perception of computers from something very different from what we see today into something very similar to what you describe. You both describe an end state that is very tactile, emotional and human.

    I struggle to get my kids to experience computers with the same sense of wonder and amazement that I had. This could be inevitable (I've always taken flipping a light switch or refrigerated food for granted) but it still makes me sad.

    • tines 5 days ago

      This is an interesting point of view, thinking about it I agree that instilling the "hacker ethic" into kids is very valuable. But how does one do that?

      For myself, I fell into video game hacking (and related hijinks) because I loved a particular game, and happened to hang out with some smart people on some forums online, and that set the course of my life to a large degree. But it seems like accidentally having these experiences is harder and harder these days.

      How do we set our kids up, or set the world up for our kids, to have these experiences organically?

  • MarkusWandel 4 days ago

    BTW back in the early C64 days (before 1985), the user base was "programmers" and "people who are going to learn how to do it any day now". Everybody played video games, but the machine was still seen as a challenge to create something with. The vast majority of the "any day now" crowd never did anything more than run ready-made apps of course. The "just buy it to run video games" thing came later.

    • tssva 2 days ago

      I got my C64 prior to 1985 and many of my friends did also. I was the exception in not being part of the "just buy it to run video games" crowd.

  • bluescrn 5 days ago

    So much of the effort and expense has gone into creating a large plastic shell and custom keyboard, though, resulting in a costly product targeting a very small niche of the already fairly niche hobby that is retro computing/gaming (or even retro Commodore enthusiasts).

    And that large plastic shell of the C65 was always an ugly design, and I can't imagine it's comfortable to type on with that floppy drive protrusion so close to the arrow keys?

  • kstrauser 4 days ago

    I dug out an old HP-50g calculator I got a while back. I'm finding it trips my triggers in the right way, with nearly all of the points you made. It's a tiny, programmable computer that's fully knowable. I was surprised how much fun I'm having with it.

  • ryukoposting 4 days ago

    > The computer should be _reactive_, not proactive. It should never give you a notification. You should be in charge of retrieving all information yourself, like a library, not a call center.

    I think we all know how impactful the smartphone has been, but your way of putting it is especially succinct. It's not the iPhone that mattered in particular, it's the convergence of phone and computer. The telephone was one of the first pieces of "noisy" technology, i.e. it yells at you when it wants your attention. Merging that with the versatility and sheer power of the computer was... consequential, to say the least.

  • ruk_booze 5 days ago

    Sounds like you may want a Commodore 64? Preferably equipped with an Ultimate 1541.

    Or if that is too limited, go for the Amiga. It is more modifiable.

    As a sidenote, I got my Mega65 just the other day. Been waiting almost a year for it :)

    • tines 5 days ago

      Real Commodore hardware is going to break down eventually, I'd love to have something that uses modern parts so we aren't dependent on rare parts that are going to disappear or become super expensive.

    • DowagerDave 5 days ago

      Vintage hardware is too expensive. I'm not sure why, or who's buying it, but if you want to actually run & program on it, you're better off with emulators.

  • lproven 2 days ago

    > What other machines are out there like this?

    Quite a few.

    * Foenix 256

    * Commander X16

    * ZX Spectrum Next

    * ZX UNO / ZX DUO et al.

    * Chloe 280SE

  • CalRobert 4 days ago

    Then their school gives them a tablet because math on paper is old-fashioned. But I agree.

  • robinsonb5 4 days ago

    > I've been feeling lately that as computers have become more advanced and software has become more inscrutable, our relationship with our computers has changed, for the worse.

    Very much so. Technology stopped being about empowerment some time ago - it's been subverted into a tool for erecting virtual tollbooths.

    > This essay hit home for me: https://explaining.software/archive/transparent-like-frosted...

    Thanks for that - I just read it. This made me grin:

    > "By the mid-1990s," Turkle says, "when people said that something was transparent, they meant that they could immediately make it work, not that they knew how it worked."

    ... and by the late 1990s people meant they could see its PCB through the case!

  • GenericDev 5 days ago

    I agree with you a million percent, so you're not alone in this. But we are very much the minority :(

    It feels like people aren't interested in being creators. Just consumers. And that shows in how media and companies refer to people as consumers.

    I wish there was a way to reverse this trend. It feels in many ways like a Plato's cave kind of situation.

    • sfjailbird 5 days ago

      Even back when all the kids had C64s, most only knew enough about it to load up games from the tape drive. Personally, I was intrigued by the built-in BASIC, and that got me started programming (and I absolutely loathed the mindless consoles like the Nintendo Entertainment System), but I was very much in the minority.

snozolli 5 days ago

That box design is giving me huge nostalgia waves.

> A program most nerds (including me) used to run on every computer we came across in department stores back in the 80’s. The salesmen must have been sick to death of kids doing this all day every day – some of the messages weren’t always so polite either!

As a diehard Amiga fan, I would always stop at the Macs on display at office supply stores and switch the color setting from "thousands of colors" to grayscale. Just doing my part.

In programming class at school, we'd be taught for a bit, then go to the computers to practice what we'd learned in BASIC. When class was over, we'd write programs to loop for maybe 15 minutes, then emit an obnoxious sound to interrupt the next class. Ideally, the chosen frequency would be high enough that the teacher couldn't hear it, but the class could. Truly madlads.

  • gwd 4 days ago

    Oh man, Mac and their "thousands of colors" -- that lit up some neural patterns that have been dormant for a long time!

PaulHoule 5 days ago

My favorite modern retrocomputer these days is https://www.olimex.com/Products/Retro-Computers/AgonLight2/o... which is orders of magnitude more powerful than the computers it is modeled on, but affordable.

  • the_af 5 days ago

    Is that just a board or a full computer with a case, keyboard, etc?

    I find the all-in-one kind of retrocomputers more appealing than the DIY projects (knowing nothing of electronics or of sourcing parts, cases, etc, DIY is not for me).

  • NikkiA 5 days ago

    The eZ80 is nice, but it's not 'orders of magnitude' more powerful than a regular Z80, especially clocked at 20MHz (the CMOS Z80C was available up to 20MHz).

    (The eZ80 is capable of 50MHz; I'm not entirely sure why they limited it to 20MHz in the Agon Light.)

    • Lerc 5 days ago

      I just had a quick look at the eZ80 and it seems like it's pipelined, so the instructions per clock will be a lot better than the Z80's. A 20MHz eZ80 is probably one order of magnitude improvement over the oftentimes 4MHz Z80.

      Wikipedia says three times faster at the same clock speed. So 20*3/4=15 give or take.

      As an aside, last time I did napkin-math estimation on the 8-bit AVR, it was faster than a 68000 at the same clock speed. The 68k took 4 clocks for register-to-register operations; the AVR is mostly one clock per instruction, so it could do most 32-bit operations as fast or faster using multiple instructions.

      • PaulHoule 5 days ago

        I love AVR8 assembly so much, and AVR8 is so much better than any of the 8-bit micros in so many ways, except for the small RAM size (though it does make up for it in ROM).

        • Lerc 5 days ago

          Me too, I made a silly fantasy console using it

          https://k8.fingswotidun.com/static/ide/?gist=78d170a65bc6c9d...

          In hindsight, it's not the best for emulation; I think most of the emulator time goes into extracting the bits from instructions. I really should make a translator that turns it, instruction by instruction, into something easier to decode.

          • wk_end 5 days ago

            On a CPU with 8- or even 16-bit instructions, it's probably better, if at all possible, to use jump tables rather than decode bit-wise when emulating, I think (I don't know much about AVR8, though).
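
            To illustrate, here's a minimal C sketch of that approach for 16-bit opcodes: build a 65536-entry function table once, then dispatch with a single indexed call per instruction. The cpu_t/op_add/op_nop names are made up for the example, and only the AVR ADD encoding (0000 11rd dddd rrrr) is filled in:

              #include <stdint.h>

              typedef struct { uint8_t regs[32]; uint16_t pc; } cpu_t;
              typedef void (*op_fn)(cpu_t *, uint16_t);

              static void op_nop(cpu_t *c, uint16_t op) { (void)c; (void)op; }

              static void op_add(cpu_t *c, uint16_t op) {
                  /* operand fields could be pre-extracted into a side table too;
                     SREG flag updates omitted in this sketch */
                  int d = (op >> 4) & 0x1F;
                  int r = ((op >> 5) & 0x10) | (op & 0x0F);
                  c->regs[d] += c->regs[r];
              }

              static op_fn dispatch[65536]; /* one entry per 16-bit opcode */

              static void build_table(void) {
                  for (int op = 0; op < 65536; op++)
                      dispatch[op] = ((op & 0xFC00) == 0x0C00) ? op_add : op_nop;
              }

              /* the inner loop becomes one indexed call per instruction */
              static void step(cpu_t *c, const uint16_t *flash) {
                  uint16_t op = flash[c->pc++];
                  dispatch[op](c, op);
              }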

            • PaulHoule 5 days ago

              I’ve thought about making an AVR-8 emulator in AVR-8 assembly language that would let me run tiny programs out of RAM, where the emulator (host) would use some of the registers and the others would be available to the guest. This way you could upload a function over the serial port and run it. I figured it would be a good way to really master AVR-8 assembly.

              The ESP32 has a huge amount of potential for emulating other things, see

              https://github.com/breakintoprogram/agon-vdp

              for a display controller implemented on the ESP32.

    • talideon 5 days ago

      Look, if you're going to get picky, it literally is at least an order of magnitude faster. Besides, Hz for Hz, the eZ80 is much faster than the Z80A, which is what people would actually be comparing it to.

MarkusWandel 4 days ago

You have to have played with Commodore BASIC's PETSCII graphics abilities "back in the day" to really appreciate them. With no starting skill you could do animations, rudimentary video games and what have you, all in BASIC with just character graphics.

The reason it worked so well was the cursor control characters. Just by printing a string, for example, you could draw an outline box with text in it, or a little man, or whatever, in an instant.

The speed of BASIC was still an issue. I animated a train driving across the screen, about 10 characters high, and it worked fairly well, but you could see a bit of a ripple. I don't remember exactly how, but, for example, each 1-character-wide "slice" of the train could be a string; then you just print your 40 "slice" strings in a row and there's your train. Pick your starting offset in a larger array to draw it in different phases of motion.
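
For flavor, here's the same sliding-window trick as a minimal C sketch, with ANSI cursor homing standing in for PETSCII cursor-control codes (purely illustrative, nothing like the original BASIC):

  #include <stdio.h>
  #include <string.h>
  #include <unistd.h>

  #define SCREEN_W 40

  int main(void) {
      const char *art[2] = { " ____ ", "|o--o|" };   /* a tiny 6-wide "train" */
      char strip[2][2 * SCREEN_W + 7];

      /* Pad the art with a screen's worth of blanks on each side so a
         40-column window can slide across it -- the same idea as picking
         a starting offset into a larger array of slice strings. */
      for (int r = 0; r < 2; r++) {
          memset(strip[r], ' ', sizeof strip[r]);
          memcpy(strip[r] + SCREEN_W, art[r], strlen(art[r]));
          strip[r][2 * SCREEN_W + 6] = '\0';
      }

      printf("\033[2J");                             /* clear screen */
      for (int off = SCREEN_W + 6; off >= 0; off--) {
          printf("\033[H");                          /* home the cursor */
          for (int r = 0; r < 2; r++)
              printf("%.40s\n", strip[r] + off);     /* one 40-column window */
          fflush(stdout);
          usleep(60000);                             /* ~16 frames/second */
      }
      return 0;
  }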

A faster CPU totally solves this. Now you have a machine where non-programmers can do really cool graphic stuff and smoothly too, without ever leaving BASIC.

The next step, generally, was about reprogramming the character set. Now your BASIC, character cell based graphics could have custom pixels, not just the preformed PETSCII characters.

I once saw a cute little character based platform jumper game on someone's VIC20 and went home and implemented it, from scratch, on the C64 in an afternoon. In BASIC, with a few custom characters.

But what may be missing in this retro scene is being able to show off your creations to everyone else who has the same computer. Without that, kids may not get interested.

timbit42 4 days ago

This reminds me of the Foenix series of computers by Stephany Allaire, such as the F256K2, which is much like what a Commodore 256/512 might have been as a successor to the Commodore 128.

They are available in 8, 16, 24 and 32-bit systems with a variety of CPUs such as the 65c02, 6809, 65816, 68000, or 68040.

Main website: https://c256foenix.com/

Wiki: https://wiki.c256foenix.com/index.php?title=Main_Page

mcejp 5 days ago

I am mildly impressed that the floppy drive is not a supply chain liability nowadays.

  • reaperducer 5 days ago

    I've never understood why retro computer enthusiasts go through such effort to replace their floppy drives with CF cards, when Sony had a solution almost 25 years ago.

    My DSC-30 came with a metal floppy disk that has no moving parts. But you could insert a Memory Stick into it, and then stick it in any 3.5" floppy drive and read the stick as FAT.

    Every time I see someone on the VCF forums struggling with the latest floppy drive replacement board I wonder what ever happened to that technology.

    • an-unknown 4 days ago

      Simple: a real 3.5" floppy disk drive has moving parts and various things that age and eventually break. For example I have an old device with a broken floppy disk drive which can't even read a real floppy anymore. With the metal floppy "emulator disk" you mentioned, the FDD itself still has to be fully functional in order to read this "emulator disk".

      A floppy emulator board which reads SD/CF cards or USB sticks doesn't have that problem at all, since it's purely solid-state electronics connected directly to the FDD's electrical interface in place of the real drive. Usually you can put thousands of floppy disk images onto such a memory card or stick and select which image is "inserted" into the emulated floppy disk drive, so there is simply no need for the "emulator disk" technology you mentioned anymore.

    • kevin_thibedeau 4 days ago

      Those floppy disk emulators require special drivers to prevent the drive head moving off the transducer. It isn't worth the hassle to get them working on non-PCs.

    • JPLeRouzic 4 days ago

      > "metal floppy disk"

      That's an interesting combination of words!

  • topspin 4 days ago

    That floppy drive looks like it was deliberately designed to support a coffee cup.

jandrese 5 days ago

According to other sources on the internet, that 12-pin header by the removable access cover connects to GPIO pins on the FPGA.

https://shop.trenz-electronic.de/media/pdf/ca/ca/31/Mega65-P...

mass_and_energy 5 days ago

Didn't one-byte man do a good review on this product?

  • unwind 5 days ago

    Yes, here is the MEGA65 video from The 8-Bit Guy: https://www.youtube.com/watch?v=8qHdTKjPXww.

    • MarkusWandel 5 days ago

      Speaking of whom, I'm surprised nobody has mentioned his Commander X16 project yet. Much in the same spirit, a "fantasy C64 successor", just not based on the C65. Looks like about the same market penetration too (on the order of 1K units to date).

      • timbit42 4 days ago

        MEGA65 is around 1,800.

stonethrowaway 5 days ago

A fully assembled computer? Get out of here with your fancy X2 safety caps and shock proof solder joints.

Now where’d I leave those Galaksija resistors…

vunderba 5 days ago

I've been following this project pretty closely, and even though it is based on the prototype Commodore 65, I kind of wish they had just gone with the superior aesthetics of the classic C64 for the outer shell, even if that would have been less accurate. And the extra-long spacebar, just ugh.

IronWolve 5 days ago

That's pretty cool: expensive, but nice features and so expandable. Looks like a large community too.

  • the_af 5 days ago

    Interesting. How many MEGA65 units exist out there?

    • layer8 5 days ago

      The three batches produced so far are 400 + 400 + 1000 = 1800 units.

bezkom 5 days ago

Doesn't that name risk being confused with the Atmel ATmega CPUs used in Arduinos?

https://www.microchip.com/en-us/product/atmega64

  • ziddoap 5 days ago

    I don't think there is much risk of confusion between a microcontroller and an all-in-one retro computer. They also have different numbers (64/64A vs. 65) and the Mega65 isn't prefixed with an 'AT'.