The D Programming Language (dlang.org)
198 points by LorenDB on Dec 4, 2023 | 242 comments


D is a niche language that never quite managed to become mainstream. I came across an excellent article, Why your F# evangelism isn't working (https://ericsink.com/entries/fsharp_chasm.html), that explains why this didn't happen for F#, but the insights generalise to other languages. Here's my summary:

- Adoption of any technology follows a normal distribution.

- The segments are early adopters, pragmatists, late adopters and laggards.

- If you want a technology to become mainstream, you need to make the pragmatists adopt it.

- This is rational behaviour and there's no use trying to convince the pragmatists otherwise. They will choose "predictably disappointing" over "excellent and unproven" every time.

- Technology is merely a tool to them. They don't care if it's fun or not, as long as it gets the job done.

- Mainstream success is impossible without getting the pragmatists on board. Until then you have a curiosity used only by early adopters.

- How to cross the chasm? He recommends finding a pragmatist in pain and solving their problems. Gradually convert pragmatists one by one.

I don't think D was ever able to articulate to the pragmatists why they should adopt D over C++ or other languages. Fixing a few warts with C++ wasn't enough.

Meanwhile Rust managed to appeal to the pragmatists by saying "we know you use C and C++ because of the performance despite the issues with it. Why don't you try a language with the same performance, which also fixes some of the issues (buffer overflows above all) at compile time".

Rust was far from perfect when it started seeing adoption by pragmatists, and it remains far from perfect today. But perfection isn't the goal, nor is a "fun" language for early adopters. What matters is solving a problem that pragmatists face and then convincing them to give you a shot. In practice this is a bar that is high enough that most languages never cross it.


You also need to take history into account:

- non-free compiler

- GC (when Java, C#, etc. had already eaten into that C++ segment)

- competing standard libraries

These have been solved in different ways, but disseminating that knowledge and changing people's gut feelings is hard. It's then made more difficult by competitors coming out before you've overcome it (e.g. Rust)


Missteps are fine if it’s possible to make a compelling pitch to a pragmatist in pain. But was there ever such a pitch? Not that I recall.


Check the year when David announced D support in GCC - https://forum.dlang.org/post/c3mnst$2htg$1@digitaldaemon.com . So yes, you might be right about the non-free compiler, but only until 2004.

GC - you can build an entire application _without_ the GC. Nobody forces you to use it.

Competing standard libraries are a thing of the past, since the Tango project was abandoned.

So really, what are you talking about?


I acknowledged that in my comment. My point is that it takes a lot of work to undo the damage to people's perception, and the time you lose doing so puts your community that much further behind.


> GC - you can build an entire application _without_ the GC. Nobody forces you to use it.

This is a gross misrepresentation. With D you must opt out, and be cognizant of which libs may or may not require the GC.

> Competing standard libraries are a thing of the past, since the Tango project was abandoned.

We are discussing why D failed adoption where the history and past are INCREDIBLY important factors.

So really, what are you talking about?


It is not necessary to opt out of the GC. Just don't use it.

If the @nogc attribute is used, the compiler will tell you where the GC is used.
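
A minimal sketch of what that enforcement looks like (the function names here are made up for illustration):

    // @nogc makes any GC allocation in the function a compile-time error
    @nogc int sum(const(int)[] a)
    {
        int total;
        foreach (x; a)
            total += x;
        return total;
    }

    @nogc void wontCompile()
    {
        // auto arr = new int[10]; // error: cannot use `new` in @nogc function
    }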


My hot take: when D intends to compete with C++, having an opt-out GC is a deadly disadvantage.

It is difficult to prove to the performance-oriented people that GC'ed languages can be performant, and years of JVM stop-the-world GC didn't help to improve the reputation of GCs either.

Even though D has @nogc, non-D programmers would probably have serious reservations as to whether the ecosystem is split between GCed and GCless, and how much that split would affect them and their project.

IMO, Rust did it quite right in this regard by making GC opt-in (with rust-gc) - although borrow checking is an imperfect way of doing safe memory management (eg. cyclic references), the fact that it attempts to be a "safer C++ at no runtime cost" certainly has pulled some people over.


The whole GC thing is a non-issue to a pragmatic programmer. D's GC is not going to pop out and smite your program. It's just one of many methods of dealing with memory that D provides.

The "serious reservations" is simply unfamiliarity.

The GC has uses where it shines. Allocating data structures that last the lifetime of the program instance. Enabling data allocation during Compile Time Function Execution. Scripting programs.
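
As a small sketch of the CTFE point (assuming nothing beyond the standard language), code that allocates freely can still be run at compile time:

    // GC-style allocation works during Compile Time Function Execution
    int[] squares(int n)
    {
        int[] result;          // dynamic array, normally GC-backed
        foreach (i; 0 .. n)
            result ~= i * i;   // appending allocates
        return result;
    }

    enum table = squares(10);      // `enum` forces compile-time evaluation
    static assert(table[3] == 9);  // checked before the program ever runs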

Borrow checking has its costs, too. Some data structures won't work with it. Sometimes to conform to it means allocating an extra copy. Some code has to be marked as unsafe.

D has a borrow checker, too. It's opt-in, though.


Btw. the "Chasm" idea is from the book "Crossing the Chasm: Marketing and Selling Disruptive Products to Mainstream Customers" by Geoffrey A. Moore

D has two problems: it doesn't have enough "new" stuff to appeal to the "early adopters" (Rust has sum types + pattern matching, type classes and the borrow checker) and, as you've said, it couldn't convince enough "pragmatists" to switch.


D has a borrow checker, too.


For anybody interested: https://dlang.org/blog/2019/07/15/ownership-and-borrowing-in...
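
Going by that post, a minimal sketch of the opt-in (names made up; details may vary across compiler versions):

    // @live opts this one function into ownership/borrowing checks
    @live void example()
    {
        import core.stdc.stdlib : malloc, free;
        int* p = cast(int*) malloc(int.sizeof);
        *p = 42;
        scope int* q = p; // q borrows p; using p while q is live is an error
        int val = *q;
        free(p);          // ownership: p must be disposed of exactly once
    }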

You really aren't the best language salesman on earth ;). And that's great!


You're right, I'm born to be a nerd, not a salesman.


Besides the "pragmatist in pain" route to mainstream there's also the "benefits for free" route.

C++20 is a very different language from C, but C++ went mainstream because at the beginning it was "C with Classes". TypeScript is similar.

Every company has a bunch of "early adopter" engineers. The easiest way for them to introduce something new at their company is by claiming that the transition will be seamless and free.

D partially tries this path with its C compatibility. But it's a rare C codebase that doesn't use function pointers or array pointer syntax. I don't think D had any choice but to abandon those, but it made this path a lot harder.

P.S. there's a third route: piggyback along with a market transition. That's how Java & Javascript & Perl went mainstream.


D supports function pointers and array pointers, and always has.


But not in a C-compatible way, right? IIRC, you can't just take a good-sized C file, put a tiny bit of D in it, and compile it with the D compiler, can you? Happy to be corrected.


Yes, it's semantically C-compatible. They're close enough that converting C code to D code manually is straightforward. But that isn't necessary; you can simply import C code into D code:

    // D code

    import mycstuff; // loads mycstuff.c
    int main() {
        int* p = mycfunction(); // call C function
        return *(p + 1);  // pointer arithmetic
    }


Translating small syntax differences like that is easy to do by machine if you have to.


>Rust was far from perfect when it started seeing adoption by pragmatists

Isn't Rust still a niche programming language? When compared to C, C++, C#, Javascript, Java, Python?


It certainly is, if only because the corpus of all Rust software is minuscule compared to that of other mainstream languages.

With Rust being in the kernels of both of the most mainstream operating systems, and gaining traction in the embedded world, it would take a catastrophe of gigantic dimensions to knock that tanker off course from becoming mainstream.

It will take time of course, but that is no different from other contemporary mainstream languages. I remember Guido van Rossum being asked what he'd consider Python's breakthrough, and he answered that it had none, that it just grew slowly and steadily over a long time. Now things have been set in motion for Rust; we will see where it leads.


I think Rust will remain a niche language because its learning curve is high compared to its main benefits.

Also, Modern C++ is reasonably safe for most uses (almost safe), and things like the Hylo language are IMHO just a better model than Rust's borrow-everything, which is basically viral at the API level and hard to manage compared to mutable value semantics.


Except that Microsoft, despite being a C++ powerhouse, recently decided that all Azure systems programming that would have been done in C++ in the past will be done in Rust from now on.

This is backed by a 10-million-dollar engineering effort, alongside a million more as a donation to the Rust Foundation.

The issue with Modern C++ is that we are already in a Post-Modern C++ phase, and plenty of people keep writing classical C with Classes instead.


What learning curve? Between the borrow checker, copilot, and gpt it’s not really hard to know when to use &.

I've been rewriting my Python personal project in Rust the past month and it's going much quicker than I expected. And with Dioxus I'll be able to distribute a full multi-platform GUI app at 10 MB, vs Python + JS and all the fun that comes with that.


Not everyone works with GPT and Copilot. And it's not only using &, but annotating 'a and 'b and 'c all the way up the call stack, then redoing it all again when refactoring.

I do not have extensive Rust experience, but the times I tried it, it got in my way. My background is C++ (20 years using it, 14 years working in it). They say that C++ best practices carry over to Rust, but even coming from a C++ mindset it doesn't always feel that similar. In real life it's just quite a bit more ceremony, IMHO, from what I saw.


The problem is you're probably well above the median developer in terms of talent. Now imagine all the enterprise devs that do boring Java and C# and only got into software because of the pay not because of any passion for it, and imagine how easy it would be for them to learn Rust.


Lmao Ty I’ll put that on my resume if I ever apply for a dev job.


I have not yet tried copilot with Rust, but when GPT-4 came out I made many experiments with Rust, Python and Java to see what it can do.

My conclusion back then was that Rust was by far the worst, mostly because GPT-4 just didn't get the borrow checker. Coming from C/C++, and being of the opinion that people complaining about Rust's steep learning curve are often a bit overdramatic, it convinced me that they might actually have a point. If even the LLM doesn't get it...


>I think Rust will remain a niche language because its main benefits compared to its learning curve are high.

Honestly? I simply don't think the learning curve of Rust is worth mentioning, or at least in my opinion it is on the same level as C++'s. Or, putting it this way: using C or C++ does not mean that you are free from managing the objects' lifetimes.


In Rust you have to annotate all lifetimes, and when refactoring, do it again all the way up and down the call stack.

Since it does not have exceptions, the same goes for propagating the Result type up and down. Both go viral. The compiler is strict.

Of course you have to think about lifetimes in C++ too, but it is more flexible.

That said, Rust pattern matching and traits are nice.

Rust is stricter and the learning curve is steeper overall IMHO.

I really think a borrow checker is not the way to go for safety. The Hylo language model is what I would choose first, above everything else.


That's the thing. You still have to think about lifetimes when you refactor C++ code too. I have accidentally introduced segfaults refactoring my own C++ code, that is for certain.

You still have to do all the hard work thinking about lifetimes even in C++, things like whether lifetime A will surpass lifetime B and all that jazz. It's just not spelled out.


> That's the thing. You still have to think about lifetimes when you refactor C++ code too. I have accidentally introduced segfaults refactoring my own C++ code, that is for certain

As long as you work a bit defensively, do not use references or raw pointers, and use smart pointers and values instead (span and string_view are NOT values, they have reference semantics), things should be ok lifetime-wise.

It is just that sometimes we want to go a tiny bit faster, fall into the temptation of returning a reference or similar, and mess it up when refactoring. But if you take into account the 80/20 rule, then sticking to safe practices by default, and leaving a pointer or a reference for an inner loop or something performance-critical in a very controlled environment, is the wise choice.

In my experience, sticking to these practices makes things work very well. I rarely see a segfault in my code when coding C++. I also use `.at()` systematically, btw. And I am very conservative when using `string_view` and `span`.


This is a really good point, and I agree that this is a challenge for Rust. The counterpoint I have to offer is that the learning curve is reasonably flat when you are coming from C/C++, and that that is a significant enough target group to pull Rust through until its new ideas become mainstream. I still remember when object-oriented programming came around and many programmers found it quite alien. A notion that is hardly conceivable today. I hope that it will be similar with Rust.


Why go through all the trouble when you can do this: https://www.hylo-lang.org/ and not spend a second thinking about lifetimes? No, copies will not be issued unless necessary.

Or why not keep exploring this idea as well? It's more research-oriented than the first one right now, though, so take it with a grain of salt: https://vale.dev/ Look at "generational references".

I find Hylo's solution the correct mainstream one, and the one in Vale promising, though I'm not sure where it will lead or if it can keep up to its promises.

Both take a high cognitive overhead off you and keep memory safety intact without a GC.


I think in the grand scheme of things, it probably is. But it seems to have a momentum behind it that I can't quite see in, say, D or F#. Rust's popularity ratings in Stack Overflow's annual survey are pretty impressive https://survey.stackoverflow.co/2023/#section-admired-and-de...


For JavaScript there's been a sharp increase in the amount of tooling that is written (or rewritten) in Rust: Turbopack, SWC, Vite's upcoming rewrite of Rollup. It doesn't seem like Rust is competing with JavaScript for writing web applications, but the tooling is definitely seeing more adoption of Rust.


In general Rust is great for writing developer tooling. You need low latency, high throughput, low memory footprint while being stable enough that it doesn’t crash much. Projects like Turbopack, Ruff (Python linter), Biome have all had some success and may continue to.


If you read about diffusion of innovation, there is the classic example that the cure for scurvy was known way before being adopted: https://web.archive.org/web/20151022041843/http://www.idlewo...


A well written and absolutely gripping & interesting tale, also discussed here at the time (including by the blog author) https://news.ycombinator.com/item?id=1174912 and re-posted many times without as much discussion ( https://hn.algolia.com/?dateRange=all&page=0&prefix=true&que... ).

Particularly striking was how concurrent innovations like copper tools, steam boats, and milk pasteurization derailed the only partially developed theory of scurvy. One wonders if this story partly inspired Woody Allen's joke in _Sleeper_ about steak being found to be good for you.[1] { Probably more the constant thrashing about of recommendations that has infected nutrition science since forever, though. }

[1] https://www.youtube.com/watch?v=1yCeFmn_e2c


> Meanwhile Rust managed to appeal to the pragmatists by saying "we know you use C and C++ because of the performance despite the issues with it. Why don't you try a language with the same performance, which also fixes some of the issues (buffer overflows above all) at compile time".

And introduces a straitjacket that prevents valid patterns from being used and makes binding to C++ hell.


Like I said, it doesn't need to appeal to everyone to succeed, just a critical mass. Binding Rust to C++ isn't easy, but the value proposition is strong enough that the Firefox and Chromium (early days) projects use Rust in their massive C++ codebases.

But who knows how well a language that prioritises interop with C++ will fare? I guess we'll find out if Carbon 1.0 shows up in 2025. Similarly Zig, with its fantastic C interop.


It can appeal to niches with critical mass. But then it will not be "mainstream". It will be something else.


I'm defining a mainstream language as one that will continue to be actively developed 10 years from now, will have a thriving ecosystem of libraries, and will be easy to hire developers for. If you have these three, you can start a new code base in that language today without worrying that you'll be left with legacy software in a few years that'll need to be rewritten.

This definition of mainstream is more useful than “almost all devs know the language”, which is something that only applies to Python and JavaScript anyway.

By this definition, you probably don’t want to start a new codebase in COBOL, Perl, Objective-C or other languages that are trending downwards in usage. But you can use languages that have reached that critical mass like Rust and Go, even if every single developer might not know them.

I’m basing what I say on what languages developers said they knew in the 2023 StackOverflow survey (https://survey.stackoverflow.co/2023/#technology-most-popula...). You definitely would call Java and C# mainstream, and they’re used by 31% and 28% of developers respectively. Go and Rust are used by 13% of devs. Not in the same league, but they’ve both reached critical mass so they won’t disappear.


> I’m defining mainstream language as one that will continue to be actively developed 10 years from now.

Well, then Haskell or D are mainstream... but I would not say they are.

I would say mainstream are: C#, Java, C++, C, Python, Javascript, SQL... but not Haskell or D.


Or you could simply be supported by a billion-revenue company that pays your bills if you code "their custom version of C". No need to do social analytics. D could totally be that language, but it was never driven by big enough companies, and it had a specific early-years history which obscured it pretty well from the pragmatist crowd.


> never driven by big enough companies

Not sure that's true. One of the first adopters of D in production was Facebook, because one of the engineers (Andrei Alexandrescu) worked there and evangelised it heavily internally. Many projects were written in D. But eventually the cost of maintaining support for one more language in production didn't make sense. D didn't solve problems that C++ (another supported language at Facebook) didn't. The D code was rewritten. It's not that Big Tech didn't adopt it. It's that it didn't show enough value in time.


In those days D2 was probably not ready to be adopted in prod anyway, and Andrei worked as a research scientist on a specific problem, so I don't think he was at "that" level of corporate influence. He then left FB to work fully on D.

When you read about why D is not popular, one thing to keep in mind is that there was D1 for a long time and then came D2, and that transition was not as smooth as it could have been. The dev progress was, and stays, slow but steady. D is in a better place today than it was 3 years ago, for example.


Andrei Alexandrescu went back to C++ and works at NVidia on CUDA nowadays.


He's the Vice President of the D Language Foundation, although he's not contributing as he used to.


It's safe to say his salary is high enough that Rob Pike would have done the same thing. Important to note that he left his D leadership position years before moving to NVidia and he's still involved with the language.


Facebook didn't hire a dozen people to develop the compiler and ecosystem.


> Many projects were written in D.

What's your source on this?


Internal document at Meta written to explain why Meta only officially supported 3 languages at the time for server side development - Hack, C++ and Python. Rust is supported as well now, bringing the total number to 4 (https://engineering.fb.com/2022/07/27/developer-tools/progra...). The post explained why D was used at one point, but the use was phased out.

I no longer have access, but if a Meta employee is reading this, I'm referring to a post that called Alexandrescu a "whirlwind of energy" in terms of how he personally encouraged and enabled teams to adopt D for projects.


Hmm, yeah, I never saw any internal stuff (I'm just a random person on the internet), but my impression from public info is they had a few isolated individuals experiment with it on small projects but the company never formally adopted it or supported it for anything serious.


Rust is loved and has momentum. I'm not sure it's universally adopted by pragmatists, partly because it requires replacing pip, Maven, node, yum, or whatnot with cargo.

Cargo is great if you're starting from nothing, but it's not very pragmatic to ignore all your existing code and experience and form another ecosystem within your organization. There's probably a limit to how many code ecosystems you want in your org, even in large and well-funded ones.


No, you're missing the point about pragmatists. It's not necessary for every pragmatist to adopt it for every project in order to succeed, just a critical mass of them.

In Rust's case, enough large, medium and small companies have adopted it that it will continue to be a thing a decade from now. Google alone has a large vested interest in ensuring the continued development and ecosystem health of Rust, because it uses it for critical components in Android. Android isn't going away, nor will they rewrite their Bluetooth stack (https://android.googlesource.com/platform/packages/modules/B...) and Binder (WIP - https://lore.kernel.org/rust-for-linux/20231101-rust-binder-...) again in some other language. Now extend the same argument to the projects within AWS, Microsoft, Meta and others that depend on Rust and can't rewrite their Rust projects without a massive investment.

The point isn't that all new code at the pragmatists must be written in the new language. A few critical projects are, adoption within the pragmatists continues to grow with time. This allows a new company thinking of adopting Rust to easily answer the question - "Will it be around in 10 years? Will the ecosystem still be thriving? Will it be easy to hire for this?" It's a likely "yes" to all 3, because so many pragmatists are invested in making it so (https://foundation.rust-lang.org/static/publications/annual-... - See "Member Overview"). All of these companies are literally paying tens or hundreds of thousands per year to help the ecosystem flourish.


D has been around for 20+ years now.


It has, but it wasn't adopted by the pragmatists in that time.

It's hard to tell if the early adopters adopted it either:

- It doesn't show up at all in the 2023 stack overflow survey (nor in the previous two years) - https://survey.stackoverflow.co/2023/#technology-most-popula...

- It doesn't show up in questions asked on Stackoverflow since 2008 - https://insights.stackoverflow.com/trends?tags=kotlin%2Crust...

- Nor on Google insights - https://trends.google.com/trends/explore?date=2012-07-31%202...

But maybe there is a different data set that shows early adopters adopting D. But for me, I would say a language has been adopted by early adopters when it reaches 1% of all developers in a reasonably large survey. Given that, it's entirely fair for a company making a language decision now to ask if D will still be around in 10 years.


>Meanwhile Rust managed to appeal to the pragmatists

I don't see that at all. I see it appealing to idealists.

At the end of the day I think it is more of an economic-model problem than a distribution problem. You need a company to be the language's powerhouse.


All these languages (D, Nim, Crystal and many others) look nice, but unfortunately the lack of adoption means few resources and a small community, which in turn means lack of adoption, in a vicious circle. I now think the most important skill a language creator can have is the ability to create hype (as long as the language itself is good enough, of course).


I agree with you on the hype front. Hype is necessary to help a language escape this vicious cycle. But then it’s kinda funny seeing developers being especially hostile to promotion/advertising of new languages.

I’ve seen a lot of protest on HN and Reddit from people who feel the currently popular language is being “shoved down their throat”. But if you’re not open to that we would be writing C forever, let alone C++ or Java (which had their own hype cycles).

> The world is often unkind to new talent, new creations. The new needs friends.


As an enthusiastic Julia user, I can definitely attest to this. It's frustrating to just want to share with people what you find cool about a language, only to be accused of being part of some astroturfing campaign, or for any discussion of the language being immediately derailed by people complaining about 1-based indexing.

Granted, there are definitely cases out there where Julia users have been overzealous and undersensitive when trying to convince others to come try out the language, but my experience with these people is that they're almost always coming at it from a place of just being excited about finding a language that works well for them and solved so many of their issues, and so they want to share that with others.

____

D is a cool language and it's a shame it seems to get ignored so much. I hope D people continue to work at it.


I think Julia has a very bright future. There's no fundamental reason for Python to be so popular in the number crunching sector. I expect a lot of that to migrate to Julia over the next decade. Unlike D, I believe it has significant funding.


I used to think likewise, until NVidia and Microsoft decided it was about time to make Python implementations actually embrace JITs.


If all Python needed was a JIT, this would have been addressed over a decade ago. The problem isn't a lack of a compiler, it's language semantics that make the job of a JIT nearly impossible.


The usual urban myth, ignoring JIT work on languages with semantics like Smalltalk, Common Lisp and SELF, where anything goes and the whole process space can change in a method call.

JIT doesn't happen in Python because the community would rather rewrite code in C than support projects like PyPy.

NVidia and Microsoft had to come into play to change this.


None of the languages you mention there get even close to C levels of performance when you hit "anything goes" dynamic language features. Those JITs are only competitive (i.e. within an order of magnitude perf) when you are accelerating very barebones code on native types and non-generic functions, or you strip out dynamism.

In julia, there is no separation between generic functions and fast functions, and no separation between user defined types and inline allocated structs.

This is not the case with Python, Smalltalk, Common Lisp, etc.

To actually take hold and stop people from relying on C libraries for everything in Python, the Python JIT is going to have to be as fast as C, and it's also going to need to have a CPython compatible ABI because people are not going to abandon all these pre-existing libraries overnight.

Python's semantics make both of those prospects incredibly dubious.


So I am lost, are we talking about Python JIT compilers or C performance?

Smalltalk, Lisp and SELF were used to write complete graphical workstations, where a debugger code change or dynamic code load across the network could impact JIT decisions across the whole OS stack.


Python JIT compilers don't exist in a vacuum. If you want one to take over, it needs to be worth using. So comparing the performance of that JIT compiler to C performance is very relevant, since the goal has to be to replace code which was written in C with Python code.

At the same time though, because Python has such a tremendously big ecosystem, all written in C, and targeting specifically the CPython interpreter ABI, even if you make a JIT compiler that is as fast as C, it'd also need to support the CPython interpreter ABI, otherwise it'd never catch on because the JIT compiled implementation of the language would have to start fresh with almost no ecosystem support.

The situation is even worse if we take the realistic view that such a JIT compiler will make some serious performance compromises relative to the existing C code everywhere in the ecosystem, and the fact that the CPython interpreter ABI is so entangled with the internal implementation details that it's impossible to support in a way that's even approaching being performant.


Which goes back to the point of the community not caring about JITs, which is embodied in the way Python libraries that are thin wrappers around C code get called "Python" libraries.

They are as much Python as they could be Tcl.

Ironically it sufficed for Microsoft and NVidia (CUDA) to step in, followed by Facebook as well; Guido comes out of his Python retirement, and JIT and GIL removal are suddenly issues that matter.


> So I am lost

Me too.

Do you wish to suggest that Smalltalk implementations had "C performance" ?

Do you wish to suggest that some Smalltalk programs should still be a little faster than corresponding Ruby +YJIT programs ? (But then NodeJS.)

Do you wish to suggest that Ruby +YJIT programs should be faster than corresponding CPython programs ?


I suggest that any JIT implementation will be faster than CPython default install, including the old ones at Xerox PARC, Genera and TI.

Calling out to C libraries doesn't count for CPython performance, that is C code, not Python, and any language with FFI can call into C libraries.


In context, something like: Julia won't take over because someone will make a sufficiently performant JIT for Python ?

(Name dropping ancient language implementations was confusing to me, and I'm ancient.)


That person seems to just think that so long as the JIT is kinda fast, it's good enough. But that's really not the world we live in most of the time, especially if using that JIT broke binary compatibility with existing fast libraries.


There was a Python JIT a decade ago - Psyco [1]. It was/is a lot faster than normal Python. The community, however, decided to go down the PyPy route.

[1] https://en.wikipedia.org/wiki/Psyco


There's a gigantic graveyard littered with dozens of abandoned Python JITs; Psyco is hardly unique. The community has not at all gone down the PyPy route either: nobody uses PyPy, even if it's technically not abandoned.

There's little reason to think these new JITs will fare any better.


Python doesn't need a JIT; Python's strength is that it's interpreted, so it runs everywhere without having to compile anything. A JIT is only useful for servers and long-lasting processes, and Python doesn't fall into that category.

At best it needs an AOT compiler to accelerate final scripts, and that's where projects like Mojo start to become interesting.


According to benchmarks, JIT implementations are still very slow.


It suffices to be on par with other dynamic languages with a JIT.

I am also curious how Mojo will turn out, from the last LLVM developers conference talk, it is quite interesting after all.


I think Julia's biggest problem is deployment and its focus on Academia/HPC. It's a nice language and pretty fun to write, but it seems not to be good enough to replace Python, and it lost tons of momentum from the hype train in 2017-2018(?).

Julia still has a success story and is a lot bigger compared to D.


Julia is the only programming language that makes me annoyed when I think of it, because it was so troublesome to package for Arch Linux and because of how little they care about increasing the safety in the language (like Rust or Ada).


What made Julia so difficult to package? I'm genuinely curious.

> ... because of how little they care about increasing the safety in the language (like Rust or Ada).

I think it's a bit harsh to compare Julia to Rust on that front. Julia has completely different design goals. Compared to Rust's borrow checker and C++'s move semantics, Julia's immutability-by-default (+ clever compiler optimizations) gives a good trade-off between performance, safety and ease of use. While Rust goes for safety + performance, Julia goes for performance + ease of use (or rather, dynamicity).

Nevertheless, I miss some features related to type driven development in Julia, which would improve safety:

  - A newtype idiom
  - Sum types
  - Explicit, type-checked interfaces: I actually would prefer something like C++20 concepts (a blacklist approach) to Rust's traits (or typeclasses, a whitelist approach), because it would be a better fit for Julia's design goals.


The main problem with packaging julia is that it requires a custom version of LLVM (since LLVM ships breaking versions every 2 years and doesn't do downstream testing for projects other than Clang). Lots of linux distros have repeatedly tried shipping Julia with random LLVM versions and as a result, shipped broken Julia versions to people.


>I’ve seen a lot of protest on HN and Reddit from people who feel the currently popular language is being “shoved down their throat”. But if you’re not open to that we would be writing C forever, let alone C++ or Java (which had their own hype cycles).

This was Rust for a decade. It was really annoying. But now we have a wonderful new language with tons of support that could legitimately challenge C++ some day. So I tend to agree.


Universal Function Call Syntax: the syntax feature where, in a procedural language, you can call a function in a multitude of ways. For example (pseudocode):

    func toUpperCase(s: string) <IMPLEMENTATION>

    a = "hello world"
    echo toUpperCase(a) # HELLO WORLD
    echo a.toUpperCase  # HELLO WORLD
    echo toUpperCase a  # HELLO WORLD
    echo a.toUpperCase  # HELLO WORLD

With the dot syntax, the first argument is the one indicated by the dot, and the rest of the arguments go in the parentheses. In Nim, you can leave out the parentheses too, which makes `echo` statements cleaner.

It's a shame more mainstream languages don't have this feature.


(your formatting got a little lost but I get the gist)

I've done a little langdev and I've found that UFCS interferes with stuff like field access (person.name) or things like interfaces. Let's say you have a person struct like { name: str } but then also a Label interface like Label { name: fn () -> str }.

It's kind of a broader problem of being able to extend things in any place you might imagine, like I'll implement a trait on this struct here, I'll implement a function on this struct here, etc. etc. and before you know it your code is extremely nonlocal. There are ways to temper this, e.g. Rust combats this a little by having to have traits in scope for them to apply, but that's annoying in its own way.

I'm just saying it's not a free lunch, I think anyway.


Sorry about the formatting, typing this on my phone. The preview looked alright, but ig that didn't translate.

For the function-field interference, I'd probably just work it based on situation. And usually naming functions clearly as actions with verbs and fields/variables with nouns.


Sure yeah but like, you gotta deal with clashes. The problem itself isn't hard, but trying to get the ergonomics right is pretty tough, especially when you're talking about multiple dependencies that didn't coordinate at all with each other (this trait adds a "name", this one adds a "getName", this one adds a "display_name", this one adds a "get_name").


The mixing of snake_case and camelCase is solved by Nim by ignoring capitalization and underscores. This feature is often controversial when talked about on HN, but I love it. I like camelCase, but I understand why a python user would be more comfortable with snake_case. Of course one codebase should set its internal style to prevent faults in control-f.


UFCS only works well when the language also supports function overloading, which is a questionable feature by itself because it greatly complicates resolution processes and constrains other type system features. For example, function overloading in Rust will definitely conflict with traits (and Rust trait system is already complex enough). UFCS increases the complexity on top of that, and the resulting convenience is more or less marginal in my opinion.

Yes, it would be great to have a way to turn a method into a function. That doesn't necessarily mean that they have to be unified---an explicit conversion is enough.


UFCS is not the feature to call anything like `a.f(b)`, it's to make all functions callable the same way. Calling a method like `A::f(a, b)` is also UFCS and Rust does have that.


I'm specifically referring to UFCS as in D and Nim (possibly more). Rust's `A::f(a, b)` was also called UFCS in the past, but it is a distinct syntax and the term is no longer used [1]. The UFCS in question would look like `f(a, b)` instead, and conversely, you may be able to call an ordinary function `f` with `a.f(b)` as well. Rust allows neither of them.

[1] https://doc.rust-lang.org/reference/expressions/call-expr.ht...


D supports both `f(a,b)` and `a.f(b)`.
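
A quick sketch of both forms (the `twice` function is made up for illustration):

    // UFCS: any free function can be called with method-style dot syntax
    import std.stdio : writeln;

    int twice(int x) { return x * 2; }

    void main()
    {
        writeln(twice(21));  // ordinary function call: 42
        writeln(21.twice);   // UFCS call; empty parens are optional: 42
        writeln(21.twice()); // UFCS call with parentheses: 42
    }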


Why would you ever want the dotless version though, in cases where the dot version isn't terrible? (e.g. when there are multiple arguments of equal importance, or calls where the argument appears more as an option for esoteric cases than as the implementation's main subject)

I really like the option to "semantically scope" a function to its main subject, like in kotlin extension methods. But I fail to see why I'd ever want to have both options. Is it just a compatibility thing, to be able to call code written for conventional languages? Or from developers who don't like that approach?


Here's one case. Instead of

  let letters = word.chars().filter(|ch| ch.is_alphabetic());
you can just write

  let letters = word.chars().filter(is_alphabetic);
since the difference between "methods" and "functions" is purely syntactic.


Most languages that have a way to define dot-syntax functions that aren't actually methods of the type also come with the ability to use any zero-arg method (both true methods and dot-syntax extension functions) as a one-arg function reference, or rather n-arg methods as n+1-arg function references. No need to crowd the unqualified namespace just to be able to use them without wrapping in a lambda.


formatting the above function for better readability

  func toUpperCase(s: string) <IMPLEMENTATION>

  a = "hello world" 
  echo toUpperCase(a) 
  # HELLO WORLD 
  echo a.toUpperCase 
  # HELLO WORLD 
  echo toUpperCase a 
  # HELLO WORLD 
  echo a.toUpperCase
  # HELLO WORLD


D has had UFCS for maybe a decade. It's a very popular feature.


It's in cpp2/cppfront I believe, for the record


I've been interested in D since about 2010 and I'm pretty sure it's about ten years older than that. So if there's a vicious circle, I'm not sure what the effect of that is.


Ruby is an interesting case because the Rails framework was the hype train that pushed it into the mainstream. Also, would Rust or Go have gotten as much adoption if not for the strong dependency system built right in? New languages clearly need more than just a compiler or runtime to attract developers now.


It also has to be very easy to use. There are a lot of languages I won't even look at because they're not intuitive, and there's no clear reason for me to invest the time to learn them. I think developers tend to focus on "innovations" that only nerds will care about, and neglect making innovations in how easy the language is to learn, use and read. At the end of the day these features are the main factors determining whether or not a language will be adopted.


https://sr.ht/~rne/dram was my first software in D, I don't remember having any issues with the language itself. Documentation (language and stdlib) was mostly ok, same with compiler diagnostics (watch out for long method call chains involving lambdas). I agree with the diagnosis elsewhere in the thread: D is where it is today (IOW, isn't where it could be) because of the Tango/Phobos schism and closed-source license issues from almost twenty years ago.


What D has is an excellent community. Many users have told me that the D community is just the kind of community they were looking for in a programming language. For example, we don't do politics. Our forums are D only.


V-style hype?


The Nim language appears to have better GC and metaprogramming support, while also offering most of the other advantages that D touts.

If someone more experienced in D could offer any insights on why someone should prefer D over Nim for a new application, outside of low-level applications where GC is absolutely undesirable, I'd really appreciate that. I am not an expert in either.


I don't know anything about nim.

One nice feature D has is you can:

    import mycfile;
and your C functions and data will be available in your D code.


    {.compile: "myfile.c".}
    {.compile: "myfile.cpp".}
    {.compile: "myfile.asm".}

does the same in Nim.


D is used in prod :) I don't know a single company in Europe which uses Nim. And it is definitely an easier language to pick up.


Nim is used to secure the Ethereum blockchain in the nimbus client.


You can program in D without ever needing the GC.


Same in Nim IIRC

That said, I am a huge fan of D and your work in general. Please keep creating awesome stuff!


I just went through the Nim website and didn't see anything about no GC - except literally turning it off which means it won't free anything, ever.

See https://nim-lang.org/1.4.0/gc.html

Do you have a page that shows otherwise??


Besides Tiberium's and mratsim's excellent points in siblings about more recent non-tracing automatic memory management with ARC/ORC, Nim has also supported manual memory management (via raw pointers with alloc / alloc0 and dealloc) since its very beginnings (but the stdlib uses AMM almost exclusively). So, if you only do manual management and turn it off with --mm:none, memory will be freed when you dealloc (with opportunity for mistakes anything manual entails). I believe D's non-"GC" mode is the same manual style.


That page is quite a bit outdated - Nim version 1.4.0 was released in October 2020. The newest Nim 2.0 release has the default GC set to ORC, which is ARC + a cycle collector, see https://nim-lang.org/docs/mm.html.

As for ARC/ORC specifically, you can read a small introduction in https://nim-lang.org/blog/2020/10/15/introduction-to-arc-orc..., but as a TLDR: ARC is something closer to RAII than a GC (destructors injected automatically at the end of scopes) with reference counting for reference types. You can switch between ARC/ORC when compiling, and destructors give a lot of control - https://nim-lang.org/docs/destructors.html.


I want to write low level code and was considering Nim because I want to compile to C, so I can benefit from Cosmopolitan Actually Portable binaries. Do you think Nim is feasible for this? Do you know other languages that could do that, it seems my options are VERY limited (despite there being thousands of languages, there's almost none that can compile to C and gives you good memory control)... I think Zig can't compile to C yet, Rust doesn't either (except for some experimental compiler) and other options are GC'd with a "heavy" runtime which is the last thing I want.


In a word, yes.

In more words: You should be able to use Cosmopolitan libc: https://github.com/Yardanico/cosmonim

If something does not work for you, Yardanico is super duper helpful in all things Nim.

Nim also compiles to Javascript (nim js) and C++ for integration with legacy codebases, but that is probably more to the side of your interests.

While I have never seen it done and always meant to give it a try, compiling to C should allow one to also do things like write Linux kernel modules in Nim without any special support from the Linux team.


Nim has GC and runtime and Linux kernel developers do not allow third party runtimes in the kernel. Even meager Rust's "panic" runtime is a contentious issue. Can one disable runtime in Nim completely -- no GC, no exceptions?


> kernel developers do not allow third party runtimes in the kernel. Even meager Rust's "panic" runtime is a contentious

Much in Linux is contentious :-) which is why the module system is nice. A kernel module for C code requires no permission from Linux-core unless you need it distributed with the kernel (which, yes, might be required for "credibility" - but critically also might not). It may require many decls to access various kernel APIs, but those can be (semi-)automated or just done as-needed. So, Linux kernel policy is not so relevant (at best) which is what I meant by "no special support" (admittedly brief). Kernel coding is always a bit trickier, and you may need to build up some support code to make integration nice, though as well as decl generators (though none of that need be distributed "with Linux").

> Can one disable runtime in Nim completely -- no GC, no exceptions?

To answer your question, and as discussed elsewhere in this subthread, Nim has many options for memory management.. only stdlib seq/string really needs automatic methods. One can disable the runtime completely via os:standalone and statically check that no exceptions are raised with Nim's effect system (and there are also both setjmp & goto based exception impls which may/may not be workable in Linux/BSD kernel module settings).

As "proof more by example", a few people have written OS kernels in Nim recently[1,2] and there was another toy kernel long ago[3]. People have also written OS kernels in Go which "has a GC and runtime".[4] So, I acknowledge it's not quite the same example, but I also see no fundamental blockers for kernel modules.

[1] https://github.com/khaledh/axiom

[2] https://prosepoetrycode.potterpcs.net/2023/01/a-barebones-ke...

[3] https://github.com/dom96/nimkernel

[4] https://github.com/mit-pdos/biscuit/


The memory management is type-bound:

    type Foo = object      # no GC, stack object, can use destructors
    type Foo = ptr object  # manual memory management
    type Foo = ref object  # "GC", refcounting to be exact


Roger wilco!


I'm interested in your claim of Nim's better metaprogramming support, as I've stuck with D for its metaprogramming abilities. Have any good starting points?


The free online book nimprogrammingbook.com has chapters on templates [1] and macros [2] which explain them with nice examples.

From what I gather from the D docs (which I have only casually browsed through), the kind of ad-hoc AST construction that Nim macros enable is not possible in D.

[1] https://www.nimprogrammingbook.com/book/nimprogramming_rouge...

[2] https://www.nimprogrammingbook.com/book/nimprogramming_rouge...


D does not support macros quite deliberately, because macros make for incomprehensible code.
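
What D offers instead is compile-time function execution plus string mixins, which cover many macro use cases. A minimal sketch (the `makeGetter` helper is made up for illustration):

    // CTFE + string mixin: generate members at compile time, no macros
    string makeGetter(string field)
    {
        return "int get_" ~ field ~ "() { return " ~ field ~ "; }";
    }

    struct S
    {
        int x;
        mixin(makeGetter("x")); // injects: int get_x() { return x; }
    }

    static assert(S(42).get_x() == 42); // verified at compile time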


Does D have real adoption? Their success stories page lists a few bigger companies that use it for some smaller projects, but nothing about use in big production systems. Is anyone on HN primarily a DLang coder?


It is or was used by some companies in Munich (funkwerk) and a few others. Companies like eBay and Netflix have some projects in D. The users are mostly C++ veterans, so the forums are full of experienced ppl who can help. I used it heavily in my work for local scripting tasks and even wrote a small GUI app for our API. Used it mainly instead of Python. It's basically a safe C with a rich stdlib. There is almost zero syntax overhead, and hence if you know C, you should be fluent with D in a short time. It is a pragmatic language with some nice functional syntax sugar that makes Scala cry. But the language and the ecosystem struggle with the lack of ppl and overall direction. Contributors tend to complain about long turnaround times and endless PR discussions. Some D contribs eventually moved to Rust or Go, which is sad, but here we are. I think D deserves more attention than it got and continues to get.


I write D full time for my small company, and published a framework that 4 other small companies use (or have used). D is really nice for the small ISV, but is also prepared to handle massive codebases (deprecated, packages, gradual typechecks, clear deprecation policies).


The D language has almost become a German preserve, what with the only major adopters being from there. Deutsche Bahn are supposedly using it too for their platform signage.


No doubt Sociomantic Labs, which was acquired in 2014, was a major driving force there. The company was very successful and hired a lot of the best developers interested in D at that time.


That could also explain the abraunegg-onedrive (a brown egg in German) package for Arch Linux, which is the source of DLang deps on my system :D


Symmetry Investments uses a lot of D (and pays for dconf), from London.


>It is or was used by some companies in Munich (funkwerk) and a few others. There is almost zero syntax overhead, and hence if you know C, you should be fluent with D in a short time.

Das C. ( D as C )


> netflix

I remember Vectorflow for sparse neural networks.


I am. Sometimes I write some test cases for D's builtin C compiler, but it's been all D for a while. The compiler itself used to be written in "C with Classes", but it's all D now.

We have a wonderful conference every year in London. The last one was particularly nice - lots of people, everyone had a good time (especially myself).


In fairness, you kinda don't count ;)


Counting is for beginners. I'm into integrating.


A key to tackling big projects is deciding which potentially desirable features have, in fact, measure zero?


From a different direction, it was taught in one section of my university's data structures course. We were guinea pigs (https://forum.dlang.org/post/km96ho$2grm$1@digitalmars.com), but I recall the experience being fun and students largely enjoying D. Transitioning to C++/Java in later coursework and professionally was a breeze.


A few other folks (including myself) are teaching or otherwise continue to teach D.

https://youtu.be/V2YwTIIMEeU?si=To2DBlzz30XAUptN

https://dlang.org/blog/2022/02/19/how-i-taught-the-d-program...


I am. Funkwerk, where I work, uses near-exclusively D in the backend.


How did the company end up exclusively using D in the backend?


I wasn't here for it, but my understanding is that our previous boss really liked D.

(Once you have an internal D ecosystem, you start selecting for D developers, who then prefer to keep using D, and so it persists.)


That's so interesting. I really believe D deserves more of a highlight, because it's so easily adopted and interops so well with C, yet it's high-level enough to be scriptable.


Symmetry is a serious trading firm that shows up to DConf and similar events. Some of their engineers are very enthusiastic about the language and a little bemused that so many people use C++ instead.


I remember that D was instrumental for building the core product of weka.io, a fast distributed filesystem.

Not one of the tech giants, but not a small company either: https://www.weka.io/


Yes! I write D full time at my company/organisation Tagion.org.


I worked for a storage company whose product was written in D. They were later acquired by a larger international company, and I don't know what the fate of the original project was after the transition.

In this world, D was, while not a common choice, also not an unexpected one. The most common language for these sorts of products is C... but it's not the kind of C you'd find in books about the language. It's C that's heavily modded to remove or replace a bunch of stuff. D just happened to align well with what people running the project would've done to C anyways.

Also, in this context the existence of libraries doesn't matter, as almost everything is going to be written from scratch anyways. Nor does learning the language matter, since the internal infrastructure and learning how C was modified to fit the project goals would take the same time as learning another language.

To be honest, I really liked to work in that environment. Not because of any of D's features. I just hate being in the constant fear that I will not be allowed to do something that makes perfect sense, and instead coerced into idiotic workarounds that pretend to solve the problem. Which is what happens when you are required to use third-party libraries and they absolutely don't anticipate your use-case. It's the place where I was closest to being proud of what I was doing. Unlike most day jobs I had, where I felt like I need some extra time in the shower just to not feel dirty by writing code I knew full well should've never been written.


Ahrefs seems to be using it to some extent too: https://ahrefs.com/jobs/backend-engineer---sre

> Our backend is mostly implemented in OCaml with some D and C++.


I would like to be. It's very nice and clean.


IIRC Kenta Cho had quite a few of his games written in D.

https://www.asahi-net.or.jp/~cs8k-cyu/


It was because I played Torus Trooper, and the source files came with the game, that I found out D existed.

Sadly, I never had the opportunity to try to use it.


Kenta Cho is an absolute gem


The D community spent most of its time arguing over things that don't matter and missed 2 big opportunities.

1. Cloud - D would be perfect for serverless if it improved its GC and standard library; a language like Go probably wouldn't even exist in the first place.

2. Mobile - in the early days, mobile was slow and had little memory. If D had had good tools and a GUI framework, it would have taken off. Even today, it's difficult to create an app for mobile using D.

LLVM was probably the last straw that put D out of favor. Before LLVM, creating a production-ready language was difficult; now everyone is creating their own language...


A couple of remarks.

For all the praise LLVM gets, compiler-based frameworks were already a thing in the mid-70s; the IBM PL.8 RISC project and the early-1980s Amsterdam Compiler Kit are two notable examples of similar stacks.

Go would still exist, because Google would never have picked D for their cloud projects, and its adoption grew from the Go creators working at Google. See how successful the Oberon and Limbo projects from the same authors were before their Google days.

While C++, Java and .NET now have most of the features that D was known for when Andrei Alexandrescu published The D Programming Language, I still think the execution in D is better packaged, although yeah, without the related ecosystem and IDE tooling it is a hard sell.

Maybe "slow and steady" still will surprise us, like it took Rails to make Ruby known outside Japan, or maybe it is too late no matter what.

Future will tell.


A compiler translates from language A to language B, hopefully improving it along the way. If you let A = B, you get the compiler pipeline model. It's popular because it's effective and obvious.

LLVM has the core IR structure as an in memory structure and as a text format and as a binary format. The easy, lossless (modulo partial implementation of new features) conversion between the formats was a really big deal. I'd be interested to hear of prior art on it.

In particular, it means developers can diff the IR between passes and generally apply textbased tooling to it during debugging, and you can do things like link time optimisation really easily by combining separate files then running them through the optimiser.

LLVM is creaking a little under its age, and I don't think it's ideal that it's written in C++, but the flexible architecture is legitimately better than I've seen in other compilers. In particular, it has been a competitive edge over GCC.


One that I left out, Project Phoenix from Microsoft Research, using MSIL for similar purposes, with C# and C++ compilers.

Unfortunately, when they killed the project little was left on the Internet, and they have other projects with the same name.

https://en.wikipedia.org/wiki/Microsoft_Phoenix

https://devblogs.microsoft.com/cppblog/channel-9-video-andy-...

The paper "An overview of the PL.8 compiler" describes how they implemented a multi-stage IL pipeline to write a mostly safe systems programming language for the IBM RISC project.

https://dl.acm.org/doi/10.1145/800230.806977

Unfortunately there isn't much publicly available, and IBM eventually pivoted the RISC efforts into AIX, thus abandoning this effort.

The Amsterdam Compiler Kit used the EM intermediate language as its bitcode format:

https://en.wikipedia.org/wiki/Amsterdam_Compiler_Kit

If you delve into ACM, IEEE, SIGPLAN and related material, you will find similar projects; LLVM ended up getting the spotlight thanks to Apple's and Google's sponsorship, followed by others in the industry.

I still hope that GraalVM (née Maxine VM at Sun Research Labs) will keep going, as it is yet another approach to compiler toolkits, with a safer language.


>One that I left out, Project Phoenix from Microsoft Research, using MSIL for similar purposes, with C# and C++ compilers.

Can't CoreCLR be used for the same stuff?


Not really, this was something like LLVM.

For some workflows it might be doable, but for the whole package of what people use compiler frameworks for, CoreCLR still lacks many knobs.


nit:

project oberon, though in many ways related, is by wirth.

you were probably thinking of alef (the early csp language) on plan9 (the bell-labs' research unix successor).

limbo and its os/vm inferno were an attempt to commercialise those ideas amid the java craze.


Only if you ignore all the Oberon versions that came after 1.0.

So if we are being pedantic, Oberon-V,

https://www.research-collection.ethz.ch/bitstream/handle/20....

Inferno is Plan 9's successor, and Limbo embodies the ideas that Alef failed to deliver (Alef was eventually replaced by a C-based library instead).

Regardless of whether Inferno and Limbo were a way to fight against Java-based OSes, they failed in the market.


> Oberon-V [robert griesemer's phd]

thanks! i didn't know of this connection.


Which is basically the reason why Go has Oberon-2 method syntax and an unsafe package in the role of Oberon's SYSTEM.


Even if D still hasn't achieved its full potential of wide adoption, it is great to see it keeping on, slow and steady.


D users tend to be very loyal.


This is what C++ should have been: C with modules and proper compile-time type introspection. D is a great language. One thing I wish it had is native tuples and pattern matching, but I hear that's planned, so that's great.
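
For what it's worth, the library-level workaround available today is std.typecons; a minimal sketch (divmod is a made-up example):

    import std.typecons : tuple;

    // Return two values at once using a library tuple.
    auto divmod(int a, int b)
    {
        return tuple(a / b, a % b);
    }

    void main()
    {
        auto r = divmod(17, 5);
        assert(r[0] == 3 && r[1] == 2);
    }

It works, but native syntax for construction and destructuring is what's being wished for here.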


I looked into it a few years ago and used it for a couple of hobby projects. I really liked it but I was never going to be able to persuade my company to use it unless it became more popular.

Rust is trying to solve the same problem of being both safe and as performant as C/C++. Although it came along later than D, it seems to have been much more successful in gaining traction, most likely because it had the backing of Mozilla.


I would argue that Rust's success compared to D is largely due to its solution to a major problem in C++: memory safety. Beyond that, Rust and D both offer several improvements over C++ in various ways. However, these incremental enhancements are often insufficient to motivate a transition to new languages in an industrial context. This is because it's challenging to justify such a shift based solely on minor improvements that mostly enhance developer quality of life, especially considering the risks associated with interoperability with existing codebases, the availability of skilled developers, and the general preference for conservative approaches. Even with a significant improvement like memory safety, which significantly ups the value proposition, it took nearly a decade after Rust's 1.0 release to really gain traction. The power of established practices is formidable.


People who are at home in a memory-managed language tend to have little interest in a language that is slightly more low-level but still memory-managed. I believe this is the main contributor to Rust's rise, even if I suspect that most production Rust code comes from people on the memory-managed side looking for a safer way to avoid any performance compromise.


The quest to improve Java's and .NET's low-level capabilities, along with Swift, Nim, Chapel, Linear Haskell and OCaml effects, shows otherwise.

Rust's sweet spot is the OS layer, or bare-metal workloads, where, similarly to high-integrity computing, no heap allocations are allowed, or only in very controlled scenarios.


I didn't say that there's no desire for low level capabilities: without that, nobody from managed environments would care about Rust. But to overcome the skillset inertia that keeps people in the language they are already good at, the gap needs to be bigger than "it's still gc, but the runtime is slightly more lightweight". I'd rather consider those projects as evidence of how high that "different enough" threshold needs to be.

Back in 2003 I loved the idea of D, even if I never used it. But then I also loved the idea of C++/CLI, so I would not put too much weight on my judgement. My opinion about D has changed far less: I still have a soft spot for it, just not enough to make the jump.


I used D as a hobby back when the D1/D2 split was not yet fully resolved, probably around 2008-2009. I recall that I was never sure whether I should switch to D2 right away, as I had written quite a bit of code in D1. My original goal was to port some of my games from C to D, as I was already familiar with PARSEC47, which was written in D1. But the ecosystem was still at an early stage, so my project quickly derailed and I ended up writing an alternative standard library for personal use [1].

Rust hit 0.1 around the same time, and my attention ultimately turned to Rust when it hit 0.5, the version that cemented the concepts of lifetimes and the borrow checker. The same thing happened for the same reason; that's why I built Chrono and other well-known libraries in the first place. But I think I stuck with Rust mainly because it had a very crude but working package manager from the earliest releases. (While it was initially named Cargo, it was renamed to rustpkg and then replaced with a new Cargo shortly before 1.0.) So I had a reason to keep working on my libraries, because people were actively looking for the features they provided, while I hadn't seen any such movement with D.

I still don't know whether Mozilla was crucial for the observed differences between D and Rust. I do believe that Rust needed Mozilla to succeed, but that looks orthogonal to my anecdotes. My current guess is that Rust was the first major programming language that was entirely hosted on GitHub from the beginning. [2] That arguably made it much easier for people to find Rust libraries and collaborate on missing pieces. And that's probably what allowed Rust to evolve through multiple breaking changes before 1.0; the timely introduction of the current Cargo also played a role. [3]

[1] Fun trivia: some of my (in)famous Rust libraries originated from those experiences!

[2] In comparison, Go switched from Google Code to GitHub in late 2014 (https://groups.google.com/g/golang-dev/c/sckirqOWepg/m/YmyT7...). D proudly hosted its own newsgroup even back then, I think.

[3] Of course it took a lot more time for Rust to become a language that can never go away. That point is generally thought to be the introduction of `async` in 2019, because I've been told multiple times that it was the last major requirement shared by many stakeholders.


D has a bigger scope than Rust. It is Rust's scope plus use cases where a garbage collector is appreciated, and it provides more metaprogramming mechanisms.

It might be the most comprehensive programming language ever.


What are the key metaprogramming features you miss when not using D? How do they compare to Nim's?


Although I haven't used Nim, something I miss while using Rust is `static if`. You can sort of approximate / do a subset of this with #[cfg(xxx)] directives in Rust.
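
For readers who haven't seen it, a minimal sketch of D's `static if` (the Buffer type is made up for illustration):

    // The branch not taken is discarded at compile time, so each
    // branch only needs to be valid for the cases that reach it.
    struct Buffer(size_t n)
    {
        static if (n <= 64)
            ubyte[n] data;   // small buffers stored inline
        else
            ubyte[] data;    // large buffers as a dynamic array
    }

    void main()
    {
        Buffer!16 small;     // instantiates the inline branch
        Buffer!4096 big;     // instantiates the dynamic branch
    }

Unlike #[cfg(...)], the condition can be any compile-time-evaluable expression, including ones that depend on template parameters.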


`when myCondition():` instead of `if myCondition:` is done at compile-time.

Alternatively you can use a `static:` code block to force compile-time evaluation, or tag a function with {.compileTime.}, or tag function inputs with the `static` modifier.

It is possible to create a compiler or an assembler running fully in Nim macros as well:

- https://github.com/mratsim/constantine/blob/master/constanti... (all that file runs at compile-time)

You can also implement Continuation-Passing-Style transformation at compile time:

- https://github.com/nim-works/cps

Or a neural network DSL or, for a self-contained example, einsum:

- https://github.com/mratsim/Arraymancer/blob/master/src/array...

It's worth noting that Nim's async/await transformation is fully implemented as a library, in macros.


FWIW, the Nim analogue of `static if` would be `when` statements.

EDIT: or any of the fine choices in mratsim's sibling comment. Nim is Choice and has many uncommon compile-time powers.


Rust put the focus in the right place, perhaps not entirely on purpose at first, but having learned that this works they've doubled down.

Culture comes first. If you have a safety culture, it supports and enhances safety technology, and the resulting software has better safety properties than you'd get even if your technology had been just as good without the culture. If you start with the technology instead, a culture which isn't interested just undoes all your good work.

Look at C++'s span. This is a slice type, roughly equivalent to Rust's [T]. As originally proposed it had good safety properties: its index operators were bounds-checked and it provided safe but fast iteration. WG21 got hold of it, and std::span, the resulting standardized feature in C++20, doesn't have any bounds checks, destroying the safety properties. That's a product of a culture which doesn't value safety.
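
A minimal sketch of the same slice semantics in this thread's language, D, which kept the checks on by default (in a default build an out-of-bounds index throws rather than reading past the array; you have to opt out explicitly with -boundscheck=off):

    void main()
    {
        int[] xs = [1, 2, 3];
        int ok = xs[2];    // in bounds, fine
        int bad = xs[3];   // out of bounds: throws a RangeError at run time
    }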


I like D. I just wish it didn't try to chase every trend. Rust has memory safety? Oh, we'll add @live and @safe, which will kind of implement some of that. Oh, no one wants to put 5 annotations on every function, and most already-written code doesn't use them anyway? Too bad.

I always viewed D as a new language that is kind of trying to offer the convenience of C# but as a native language. But sometimes I feel like it's drifting in a different direction: in the direction of a nicer C++. I would prefer C/C++ support to be a thin abstraction layer for legacy code. But instead it seems like there is more and more integration with C++.

Even language features in D like copy constructors are based on matching the behavior of C++. If I wanted to use C++, I would just use C++ instead of bothering with bridging C++/D code. I think that's one of the appealing things about languages like Rust and Zig: they don't even try to appeal to C++ programmers. Let C++ programmers enjoy C++, but let everyone else move on.


D is not doing concepts!


For any D-language Linux nerds out there looking for a D challenge, the Tilix terminal is looking for a maintainer - https://gnunn1.github.io/tilix-web/.

Looking for a new maintainer post - https://github.com/gnunn1/tilix/issues/1700

Tilix is a great terminal which somehow balances ease of use whilst catering to power users as well. IMO that is.

Note: Not affiliated with the project. Just love using Tilix.


One of the finest overlooked features of D is the D community. It's composed of wicked smart people, nice & friendly and always willing to pitch in and help out. I must say I never expected that when I started out with D. Attending our annual D conference in London is one of my great pleasures - a solid week of interacting with my friends and colleagues from all over the world.


Is there an effort to provide formal verification features for a safe subset of the D language, similar to Ada's SPARK and related tooling?


There is a D subset:

https://dlang.org/articles/safed.html

But nothing like Ada SPARK provability stuff, I don't think.


At first my university used Python for the basic programming course, but halfway through we switched to D, and we were really confused about what D actually is. The lack of tutorials, and coming from Python, made learning D a real challenge. We haven't done any big projects, mostly just solving problems on CodeAbbey (mainly because it's one of the only websites that supports D, though we may be moving to Codeforces because they recently added D support). All in all, I'm still really confused; it feels like there's nowhere I can go to ask about this. But I've been trying to get back into C, and that definitely helps.

Well, wish me luck with my university course. I had read a lot about programming languages before coming to college and thought I had everything prepared, but it turns out life can really throw you a curveball.

P.S. If you have any guidance on how I can approach this, let me know :)


Come to the forums at https://forum.dlang.org/ and we can help!


Will do, I've never been on the forums before but I might start looking into it


I used this website to learn D: http://ddili.org/ders/d.en/


Yeah, I've been using this too, along with Mike Shah's videos.


Bookmarked


Continuing to add to my playlist on the language: https://youtube.com/playlist?list=PLvv0ScY6vfd9Fso-3cB4CGnSl...


Thank you so much. We always use your resources, and they're basically our main go-to when we don't understand anything on dlang.org (which happens the majority of the time). I hope you keep making videos, and maybe start doing some programming problems so we can better understand where each piece of syntax is used.

I am truly thankful and will continue to binge-watch your vids.


Here's also a GPT for learning D (might require a ChatGPT subscription): https://chat.openai.com/g/g-QHX6wnuyX-d-teacher


Yeah, this looks really good, but sadly it needs a subscription, which isn't really viable for college me right now. But who knows, maybe some of my friends will buy a subscription and we can learn together.

Thank you for this. Normal GPT, like version 3.5 or version 4 (from Bing), is really bad at coding in D, so this will help a lot.


There is also a reddit: https://old.reddit.com/r/d_language/


this seems a bit bizarre to me - if you are teaching basic programming then you should stick with one language. what is this "university"?


It's bizarre to me too. We changed professors halfway through the semester because the course is taught by two lecturers. One of the reasons he listed is that Python doesn't really train your critical thinking, and our seniors who were taught Python are doing worse than the generation before them that was taught C/C++. The professor also wanted a more "modern" language, which is why they chose D (D also being way harder to get into because of its "obscure" state). So here at Jakarta State University, D is our main language for programming.


Is the D language tour down? The page is not loading for me: https://tour.dlang.org/


It's been down at least since Sunday.


They are switching to new infrastructure. It's been down since second half of November.


Yeah, seems down.


I am really curious how much load the server has to handle in order to survive HN's hug of death.


I mean, that site is just down occasionally. A bunch of auxiliary infrastructure is hosted by volunteers who aren't necessarily available 24/7. It's an issue.


Volunteers?? Why host it like that? :(

For the traffic D gets these days, it seems like hosting it as static content on Netlify or somewhere similar would be viable at low cost.


Can I contribute with hosting somehow?



It's been down for a while unfortunately.


Back up now.


My understanding is that garbage collection in D is performed by a "stop the world" non-generational collector. Although GC is optional in D, without a more competitive implementation, the safest thing is to never depend on it. What advantage does D then have over languages like Rust and Zig?


D's garbage collector is quite safe. Most non-trivial programs use a mix of stack allocation, RAII, malloc-style allocation, and GC. Each style has aspects it's best at.

GC is particularly well-suited to compile-time function evaluation, which executes D code at compile time (i.e. advanced constant folding, where functions can be called).
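
A small sketch of what that looks like (greet is a made-up example); the `~` concatenation allocates, and under CTFE that allocation happens inside the compiler, so the GC-friendly style costs nothing at run time:

    // greet allocates a new string via ~, which is GC territory.
    string greet(string name)
    {
        return "hello, " ~ name;
    }

    enum msg = greet("world");            // enum initializer forces CTFE
    static assert(msg == "hello, world"); // checked before the program exists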


My remark regarding "safety" wasn't with respect to the quality or correctness of the implementation. My concern is the case where you've found yourself in a situation where you're relying too much on GC, and the application is experiencing long pauses. Changing the design to rely more on manual allocation is one way out, but having more GC choices would be nice too.


C/C++ interop and easy static linking make it super nice for writing small utilities, IMO.
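
A minimal sketch of that interop, declaring the libc function by hand just to show there's no FFI layer in between:

    // D shares C's ABI: declare the C function and call it directly.
    extern (C) int puts(const char* s);

    void main()
    {
        puts("hello from D, straight into libc");
    }

In practice you'd just `import core.stdc.stdio;`, which ships with the compiler.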


Are there GCs that don't halt execution actively in use?


There is only one like that, SGCL for C++ https://github.com/pebal/sgcl


Java's ZGC and Shenandoah collectors suspend threads only briefly -- less than one millisecond regardless of the heap size.


How come more languages don't implement non-halting GCs?


The garbage collector must lock threads if it is compacting, or if it does not maintain separate pointer-only stacks. All GCs for Java lock application threads.


D needs to focus more on rebranding and reinforcing why it's different from its predecessor.

Instead of calling it D, they should use a word that evokes some emotion; rebranding to "Destiny" would be a good start.


I think the branding hurt it for years. It was perceived as a variant of C++. That didn't work, because it's definitely not C++, so C++ programmers would say "but it does X different from C++ so it's wrong". And nobody else wanted C++.

It's a variant of C, even to the point of compiling C code. The downside to that is that C isn't exactly the hottest language among people under 50.


Guess how I found out about D lang?

I was looking through the history of C on Wikipedia, and then I somehow stumbled upon a table of all the programming languages on the wiki.

Under the C language, I saw D. At first I thought it was an old, outdated language like B.

But when I looked D up, I saw they had a website up and running. I visited it and was quite shocked: the website was modern and had an impressive description.


Developers probably don’t make choices based on the emotional valence of the name of a programming language.

Although, perhaps my decision to learn C# was a reaction to my desire to no longer be myopic…


Not consciously, but there are many reasons why D as a name detracts from adoption.

The most important is searchability (e.g. compare the results when searching "c# gc" versus "d gc"), but there are also subconscious impressions that you get from a name, whether you realize it or not. For example, having an indistinct name makes your language feel indistinct.

I understand your C# comment was meant tongue-in-cheek but I think it shows how you're analyzing this rationally while marketing is dealing with irrational decisions that happen instantly at a subconscious and emotional level.

Marketing is important.


> e.g. compare results when searching "c# gc" and "d gc"

D and Go share this problem, but they also share the solution. You wanna search for "dlang gc".

C# was impossible to search for when it came out btw, and for a while after. That one was solved by search engines eventually rather than by the community.


The world is path-dependent. You'd be surprised.


Any idea if F# is down the alphabet to be after D?

C,D,'E',F

Is there an E language? Maybe there is a rule that languages can only use consonants?


Its predecessor being the so-called D1?

Or is this an elaborate joke about how annoying it is/was to search for Rust game development topics on the web? (Because there's a game called "Rust" and a game called "Destiny" ... `is destiny better than rust for multiplayer games`)


Is this really enough to make it onto the front page of Hacker News? A link to a programming language that has been around for 16 years and is not exactly an insider tip.


Yeah, what made it blow up this time?


Which IDE best supports D? I’m a fan of JetBrains products in general but would happily install something else for better ergonomics.


VSCode with the code-d plugin.


I remember that extension eating 3 GB of RAM on my MacBook because I had format-on-save enabled.


The plugin has been worked on since, and the issue may have been fixed by now. I rarely used VSCode for D because the language doesn't really need IDE support the way Java or Scala do. But I do agree that for beginners IDE support is important. I use vim with DCD: https://github.com/dlang-community/DCD


Just checked; the issue no longer persists.


VisualD, if you're accustomed to classic Visual Studio - it's pretty much the same.


D has a language server (LSP), so you can use any IDE/editor that supports it:

- Sublime Text

- vscode

- vim/nvim

- emacs

etc..

https://github.com/Pure-D/serve-d/


The D wiki has this list: https://wiki.dlang.org/IDEs


I'm curious to know what people think of D versus Zig, two rather new languages that seem to have the same target audiences and purposes.


Zig provides a smooth transition from C, has a built-in C compiler, and is excellent at cross-compilation. D is much older and is more of a C++ alternative; Zig is more of a C alternative.


I would recommend creating learning courses on LinkedIn, Udemy, etc. That's where people go to learn new stuff; if you are not present on those platforms, you might as well be a dead language.


I would probably have gone all-in on D if Calypso hadn’t been apparently abandoned. Complete C++ interoperability is basically an unsolved and fairly important problem.


I wish that at some point universities would replace C with D.


I’ll ask my professor if he would mind



D is to C++ what Kotlin is to Java.



