I'm a confessed user of em-dashes (or en-dashes in fonts that feature overly accentuated em-dashes). It's actually kind of hard not to use them if you've ever worked with typography and know your dashes and hyphenation. —[sic!] Also, those dashes are conveniently accessible on a Mac keyboard, so there may be some Win/PC bias in the em-dash giveaway theory.
A few writer friends even had a coffee mug with the Alt+number combination for the em-dash on Windows, given away by a content marketing company. The em-dash was already very widespread in writing circles years ago. Developers keep forgetting they're in a massively isolated bubble.
The site mostly covers retro- and classic computing. (Strictly no AI-generated content.) Here it is in a convenient format:
(
:name "Norbert Landsteiner"
:site "https://masswerk.at/"
:blog "https://masswerk.at/nowgobang/"
:feed "https://masswerk.at/nowgobang/feed.xml" //covers blog only
:about "https://masswerk.at/info/" //legal info
:hnuid "masswerk"
:bio "web developer and designer, site content is mostly retro- and classic computing."
)
Your PET 2001 emulator is pretty cool, and seems much more capable and user-friendly than a lot of the downloadable alternatives.
Actually, it's a bit of an issue: it's so capable that I have difficulty justifying a downloadable alternative, even though I'd prefer to have a local copy, given the untrustworthiness of web apps over time.
Thank you! I can see how this may be a bit of an issue. However, this being a simple webpage is what allows me to quickly tinker with it and push a quick update, which is also how much of this has kept growing.
The choice of AppleSoft BASIC for a recreation seems somewhat odd and deliberate, and doesn't represent the typical limitations of the time (AppleSoft BASIC does floating-point math!): in August 1975, the MOS 6502 hadn't even been announced and the Apple ][ wasn't yet a dream. Even Microsoft's 4K BASIC for the Altair hadn't been introduced yet (that was to happen only later, in October). Meaning, none of the chosen base technology would have been available.
Something along the lines of Intel 8080 assembly may have been more appropriate, given that the target platform would probably have been a coin-op machine. (Since "Gun Fight", the first arcade video game to utilise a microprocessor, hadn't been released yet, even this would have been an ambitious choice. Atari doing research into something that required an ALU may be even more interesting than the involvement of the young Steve Jobs.)
PS: This is just to give some truth to "this is where the hackersnews jerks will say this is an ad". ;-)
Yeah... it's a bit unclear to me what hardware this was even supposed to run on. The home and arcade video games Atari was producing at the time (Pong and later Breakout) were based on discrete logic chips, so they weren't "programmable" in any modern sense of the word. As you wrote, the 6502 was only introduced later in 1975, and designs using it came even later.
Notably, Dave Nutting Associates (run by the brother of Bill Nutting, whose Nutting Associates had been Bushnell's employer and had distributed Computer Space) had played around with an Intel 4004 in 1974 and then demonstrated a CPU-based system with a frame buffer to Bally in September, which evolved into the Intel 8080-based board that ran Midway's Gun Fight. Atari would probably have been aware of this (Dave Nutting Associates had filed a patent). So, something along the lines of Intel 4004 or 8080 machine code, maybe M6800.
Curious detail: the button/widget icons of the browser chrome are composed of multi-layered box-shadows (i.e., one box-shadow definition per pixel or line, concatenated into a string) on a `:before` pseudo-element. (I don't think I've seen anything like it before.)
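To illustrate the principle, a quick hypothetical sketch (my own illustration, not necessarily how the site actually does it, names and the 1px pixel scale are assumptions): such a value can be generated from a small bitmap, emitting one shadow layer per set pixel and joining them into a single box-shadow declaration for the pseudo-element:

    # Hypothetical sketch: build a multi-layered box-shadow value from a tiny
    # bitmap, one shadow layer (an offset copy of the pseudo-element) per pixel.
    ICON = [
        "X.X",
        ".X.",
        "X.X",
    ]

    def box_shadow(bitmap, px=1, color="#000"):
        layers = []
        for y, row in enumerate(bitmap):
            for x, cell in enumerate(row):
                if cell != ".":
                    layers.append(f"{x * px}px {y * px}px 0 {color}")
        return ", ".join(layers)

    # Meant for a 1x1 px `:before` pseudo-element, e.g.:
    #   content: ""; width: 1px; height: 1px; box-shadow: <generated value>;
    print(box_shadow(ICON))
    # 0px 0px 0 #000, 2px 0px 0 #000, 1px 1px 0 #000, 0px 2px 0 #000, 2px 2px 0 #000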
For example, when a process implies a conversion according to the contract/convention, but we know that this conversion may not be the expected result and that the input may be based on semantic misconceptions. E.g., assemblers and contextually truncated values for operands: while there's no issue with the grammar or syntax or the intrinsic semantics, a higher-level misconception may be involved (e.g., regarding address modes), resulting in a correct but still non-functional output (see the sketch below). So: "In this individual case, there may or may not be an issue. Please check. (Not resolvable on our end.)"
(Disclaimer: I know that this is very much classic computing and that this has now mostly moved to the global TOS, but still, it's the classic example of a warning.)
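A minimal sketch of the kind of check I mean (hypothetical names, Python for brevity, not any particular assembler): the tool honours the contract and truncates the value, but since only the author knows whether an 8-bit zero-page operand or a 16-bit absolute address was intended, all it can do is warn:

    # Hypothetical sketch: flag a syntactically valid, but possibly unintended
    # truncation of an operand value (e.g. zero-page vs. absolute addressing).
    def encode_zeropage_operand(value, line_no):
        operand = value & 0xFF  # contract: zero-page operands are a single byte
        if value > 0xFF:
            # The output is well-formed; whether it is what was meant, only the
            # programmer can tell, so the assembler merely warns.
            print(f"warning, line {line_no}: operand ${value:04X} truncated "
                  f"to ${operand:02X} for zero-page addressing; "
                  f"there may or may not be an issue, please check.")
        return operand

    encode_zeropage_operand(0x1234, 42)  # warns, returns 0x34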
I'd suggest a simple test: remix menu items and icons and test whether this has a significant impact on usability. If not, the icons are just arbitrary decoration and ultimately add clutter.
Referring to the examples provided in the article, I'd suggest that the impact on the Safari app menu should be minimal (so these are non-functional icons), while the impact on the Move & Resize submenu would be devastating and should result in confusion (so these are essential).
If you can remix with minimal impact, don't do icons. (In the case of the app menu, these are apparently meant to add structure, which is already established by other means like menu separators, so you now have two – or, including indentation, three – systems of structure and visual hierarchy fighting each other.)
Moreover, if you put icons everywhere, you're forgoing the facility to convey state, like active-state checkmarks, since these, instead of standing out and signalling change, would just be drowned in the decorative clutter. (So what's next? Add color and/or animation, like swirling checkmarks?) And this, BTW, is also why the icons in the Move & Resize menu are effective: they convey and illustrate state (as a kind of preview), while most of the other menu icons (mostly referring to activities) do not. So, as a rule of thumb: icons referring to state may be useful and even desirable, while icons referring to activities are probably better left out. (And if you feel the need for something like bullet points to mark your most important menu items, there's probably a deeper problem with your menu structure.)