The Apache TVM machine learning compiler has a WASM and WebGPU backend, and can import models from most DNN frameworks. Here's a project running Stable Diffusion with WebGPU and TVM [1].
Questions remain around the pre- and post-processing code in folks' Python stacks, e.g. NumPy and OpenCV. There are some NumPy-to-JS transpilers out there, but they aren't feature-complete or fully integrated.
+1 for ninja. I first encountered it while using Apache TVM, which uses CMake. At first I was using the default Make backend, and builds were taking a while. Then the docs said to try adding the `-GNinja` flag to CMake to build with Ninja. I was blown away by how much faster the compilation was, and now I try to use it whenever possible.
One issue is that make defaults to a single job (i.e. one CPU), and you need to pass `-j NUMBER` to make it use more, while Ninja is parallel by default.
For my uses, I've not found `ninja` to be much faster than `make -j8` on an 8-core machine.
I'm surely also in that range. It took us years to write this project, but 57 C files is a small project. Others find themselves in the range where the build decision-making itself can take seconds with tools other than Ninja. Ninja is pared down for speed; having to declare variables before using them made me feel young again.
I had to rewrite these build files, and I just couldn't stomach the idea of writing another makefile.
This is used heavily by hardware tokens like Yubikeys, and especially by cryptocurrency people with so-called "hardware wallets" like Trezor and Ledger (which can generate many subkeys on the device).
Companies like Google now issue Yubikeys to employees [1].
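The "many subkeys" trick works roughly like BIP32-style hierarchical key derivation: one master seed plus an HMAC construction deterministically yields arbitrarily many child keys, so the device only ever needs to hold the seed. A much-simplified sketch in Python (real BIP32 adds elliptic-curve arithmetic and hardened-derivation rules; the seed and function names here are illustrative):

```python
import hmac, hashlib

def derive_child(parent_key: bytes, chain_code: bytes, index: int):
    """Simplified BIP32-style derivation: HMAC-SHA512, keyed by the
    chain code, over (parent key || child index) gives 64 bytes; the
    halves become the child key and the child chain code."""
    data = parent_key + index.to_bytes(4, "big")
    digest = hmac.new(chain_code, data, hashlib.sha512).digest()
    return digest[:32], digest[32:]

# One master seed -> master key + chain code (BIP32's master-key step
# HMACs the seed with a fixed string, as below).
seed = b"\x01" * 32  # in practice: generated on-device from a hardware RNG
master = hmac.new(b"Bitcoin seed", seed, hashlib.sha512).digest()
key, cc = master[:32], master[32:]

# Any number of subkeys, all reproducible from the seed alone.
subkeys = [derive_child(key, cc, i)[0] for i in range(3)]
```

The point is that derivation is deterministic: restoring the seed on a new device regenerates every subkey, which is why a hardware wallet backup is just the seed.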
Signal keeps all downloaded media locally until you delete it.
They don't have the resources to store files in the cloud, even encrypted, and don't appear to have taken WhatsApp's approach of backing up unencrypted media and messages to users' third-party cloud services like Google Drive and iCloud.
You can mitigate this by having disappearing chats (current longest self-destruct time is 4 weeks), or by going to Settings->Data and Storage->Review Storage and deleting the largest files.
This isn't a great UX design, as users are not informed there is a problem, or how to solve it.
WhatsApp can be configured not to save all the cat photos and memes to your library by default. You can still save the really good memes yourself if you want. Signal should just copy that feature.
Also, what good is secure encryption if I have to give out my phone number?
> Also, what good is secure encryption if I have to give out my phone number?
Actually, how could you possibly deliver secure messaging if it doesn't work with simple identifiers you already have, like your phone number? Everything should be secure; that's Signal's thesis.
This reminds me of the people who were convinced HTTPS should only be used for "important" stuff that "needs to be secure" like banking and so it's wrong to have HTTPS on your blog, or news site, or whatever.
> Actually, how could you possibly deliver secure messaging if it doesn't work with simple identifiers you already have, like your phone number? Everything should be secure; that's Signal's thesis.
It's tying my Signal identity to my phone number. To speak in US terms, you're safe from your comms being intercepted by the KGB, but now you're a person of interest to the CIA :)
I started software dev in the mid-2010s and had never heard of Borland until a former colleague mentioned its great built-in debugging tools and said they missed it.
The same colleague had some vocal criticism of `gdb` as a debugging tool, and of the state of Linux-based debugging tools as a whole, claiming that Borland's were much better and that Visual Studio (not VS Code) is one of the few development environments with a quality debugger.
I'm not sure how fair that assessment is; I've found `gdb` to be a helpful tool, though I've never used Visual Studio.
I'd say Delphi was one notch better than Visual Studio. The Pascal language was more highly optimized and easier to parse, so the debugger was much snappier about popping up the tooltips to show you the values of variables, or about jumping into the help pages on F1. I remember being especially disappointed with VS because it would spend hundreds of MB of your precious disk space installing the entire Microsoft Knowledge Base, and then hitting F1 on a code statement would bring up a mishmash of Visual Basic examples (while developing C++) for almost, but not quite, the right class.

In Delphi, it always knew exactly which class you were dealing with and would open the correct help page before your finger was fully off the function key. The help pages had been expertly written to show you the most important details first, and were easy to browse. Really, I'm surprised I don't see more people reminiscing about those help files. There was probably as much effort put into them as into the frameworks, compiler, or IDE.
Agreed. The help pages were really top notch. Other software's help pages were basically useless, to the point that no one read them. In Delphi you not only had to-the-point explanations but also examples that frequently solved the very problem one had in the first place. Sadly, as someone pointed out, most of this art seems to have been forgotten with time.
You could say the only fault with those help pages is that they were too good: some time after v5 they stopped including those lovely paper manuals, probably because they figured the help pages were good enough to replace them.
I would be keen to try, but afaik there's no way to do it on Linux which is my main dev/deploy environment.
It feels like Linux debugging is stuck in a vicious cycle: few people are putting the capital into a decent debugger UI, and thus few people are using debugging UIs (instead falling back to printf debugging, or the gdb CLI). Folks might not realise how much better it could be.
It’s night and day. The debuggers in those two tools were very easy to use, very visual, but surprisingly powerful.
I cry a little on the inside when I see developers using Visual Studio and resorting to printf statements (or the equivalent) because they’ve never even tried to use the debugger, ever.
> BTC mining is just one of those operations that is resistant to a carbon tax since the price of BTC and therefore the mining rewards can just rise to compensate.
I don't follow your logic; could you elaborate? One could say that about anything else subject to a carbon tax. Like all goods and services, the price is set by how much people are willing to pay.
"the price people are willing to pay for short-haul flights could just increase, and therefore the airline profits are unaffected."
"the price people are willing to pay for aluminium could just increase, and therefore the smelters are unaffected."
They don't really make sense. A carbon tax would still change miner behaviour, because greener energy would become more attractive (by virtue of being cheaper once carbon is taxed and fossil fuel subsidies are reduced).
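To make the relative-cost point concrete, here's a toy margin calculation. All the figures are invented for illustration; the only thing that matters is that the tax falls on the carbon-heavy input and not the green one:

```python
# Toy model: a miner earns fixed revenue per MWh of power consumed.
# A carbon tax raises the cost of coal power but not of hydro power.
revenue_per_mwh = 100.0              # value of mining rewards per MWh used
coal_cost, hydro_cost = 40.0, 50.0   # $/MWh before any carbon tax
carbon_tax = 30.0                    # hypothetical $/MWh surcharge on coal

margin_coal = revenue_per_mwh - (coal_cost + carbon_tax)   # 30.0
margin_hydro = revenue_per_mwh - hydro_cost                # 50.0

# Even if the BTC price rose enough to keep both profitable, hydro now
# beats coal, so the tax still pushes mining toward greener energy.
assert margin_hydro > margin_coal
```

A rising BTC price lifts `revenue_per_mwh` for every miner equally; it can't undo the cost gap the tax opens between dirty and green power.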
[1] https://en.wikipedia.org/wiki/Great_Green_Wall_(China)