
Yes. A number is transcendental if it's not a root of any nonzero polynomial with integer coefficients; that's completely independent of how you represent it.

We don't know that. We don't even know if there's selection bias.

The article says the research was "focusing on 246 deceased drivers who were tested for THC", and that the test usually happens when autopsies are performed. It doesn't say if autopsies are performed for all driver deaths, and it also doesn't say what exactly "usually" means.

If (for example) autopsy only happens when the driver is suspected of drug use, then there's a clear selection bias.

Note that this doesn't mean the study is useless: they were able to see that legalization of cannabis didn't have an impact on recreational use.


> The fact that the correct type signature, a pointer to fixed-size array, exists and that you can create a struct containing a fixed-size array member and pass that in by value completely invalidates any possible argument for having special semantics for fixed-size array parameters.

That's not entirely accurate: "fixed-size" array parameters (unlike pointers to arrays or arrays in structs) actually say that the array must be at least that size, not exactly that size, which makes them way more flexible (e.g. you don't need a buffer of the exact size; it can be larger). The examples from the article are neat but fairly specific, because cryptographic functions always work with pre-defined array sizes, unlike most algorithms.
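For example (a minimal sketch, with made-up function names, not from the article):

    #include <stddef.h>

    /* "at least 16": callers may pass any buffer with 16 or more ints */
    int sum16(const int buf[static 16]) {
        int s = 0;
        for (size_t i = 0; i < 16; i++) s += buf[i];
        return s;
    }

    /* "exactly 16": a pointer to an array of exactly 16 ints */
    int sum16_exact(int (*buf)[16]) {
        int s = 0;
        for (size_t i = 0; i < 16; i++) s += (*buf)[i];
        return s;
    }

    void demo(void) {
        int big[100] = {0};
        sum16(big);              /* fine: 100 >= 16 */
        /* sum16_exact(&big); */ /* rejected: int (*)[100] is not int (*)[16] */
    }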

Incidentally, that was one of the main complaints about Pascal back in the day (see section 2.1 of [1]): it originally had only fixed-size arrays and strings, with no way for a function to accept a "generic array" or a "generic string" with size unknown at compile time.

[1] https://www.cs.virginia.edu/~evans/cs655/readings/bwk-on-pas...


It was always considered bad not (just) because it's ugly, but because it hides potential problems and adds no safety at all: a `[static N]` parameter tells the compiler that the parameter will never be NULL, but the function can still be called with a NULL pointer anyway.

That is the current state of both gcc and clang: they will both happily, without warnings, pass a NULL pointer to a function with a `[static N]` parameter, and then REMOVE ANY NULL CHECK from the function, because the argument can't possibly be NULL according to the function signature, so the check is obviously redundant.

See the example in [1]: note that in the assembly of `f1` the NULL check is removed, while it's present in the "unsafe" `f2`, making it actually safer.

Also note that gcc will at least tell you that the check in `f1()` is "useless" (yet no warning about `g()` calling it with a pointer that could be NULL), while clang sees nothing wrong at all.

[1] https://godbolt.org/z/ba6rxc8W5
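For reference, the shape of that example is roughly this (a sketch based on the description above; the exact code behind the godbolt link may differ):

    #include <stddef.h>

    /* signature promises p points to at least 16 ints, so it "can't" be NULL */
    int f1(int p[static 16]) {
        if (p == NULL) return -1;  /* both compilers silently delete this check */
        return p[0];
    }

    /* plain pointer: the NULL check survives */
    int f2(int *p) {
        if (p == NULL) return -1;
        return p[0];
    }

    int g(int *maybe_null) {
        return f1(maybe_null);     /* no warning, even if maybe_null is NULL */
    }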


Interesting, I wasn't aware of that and thought the compiler would at least throw up a warning if it had seen that function prototype.


It's not intuitive, although it arguably conforms to the general C philosophy of not getting in the way unless the code has no chance of being right.

For example, both compilers do complain if you try to pass a literal NULL to `f1` (because that can't possibly be right), the same way they warn about division by a literal zero but give no warnings about dividing by a number that is not known to be nonzero.


Right, so if the value is known at compile time it will flag the error, but if it only appears at runtime it will happily consume the NULL and wreak whatever havoc that leads to further down the line. OK, thank you for pointing this out; I must have held that misconception for a really long time.


Note that the point of [static N] and [N] is to enforce type safety for "internal code". Any external ABI facing code should not use it and arguably there should be a lint/warning for its usage across an untrusted interface.

Inside a project that's all compiled together, however, it tends to work as expected. It's just that you must make sure your nullable pointers are being checked (which of course one can enforce with annotations in C).

TLDR: Explicit non-null pointers work just fine, but you shouldn't be using them on external interfaces, and if you do use them in general you should be annotating and/or explicitly checking your nullable pointers as soon as they cross your external interfaces.
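A sketch of that pattern (hypothetical names, just to illustrate the boundary check):

    #include <stddef.h>

    /* internal: callers guarantee at least 32 writable bytes */
    static void fill_internal(unsigned char buf[static 32]) {
        for (int i = 0; i < 32; i++) buf[i] = (unsigned char)i;
    }

    /* external interface: the pointer may come from anywhere, so check it here */
    int fill_public(unsigned char *buf, size_t len) {
        if (buf == NULL || len < 32) return -1;
        fill_internal(buf);
        return 0;
    }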


Wow, that’s crazy. Does anyone have any context on why they didn’t fix this by either disallowing NULL or not treating the pointer as non-nullable? I’m assuming there is code that was expecting this not to error, but the combination really seems like a bug, not just a sharp edge.


Treating the pointer as non-nullable is precisely the point of the feature, though. By letting the compiler know that there are at least N elements there, it can do things like move that read around, or even prefetch if that makes the most sense.


Indeed, at a minimum you should be able to enforce that check using a compiler flag.


You can add that check using -fsanitize=null (and you may want to turn the diagnostic into a run-time trap)
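Something along these lines (the trap flag name is from memory; check your compiler's documentation):

    # enable UBSan's NULL checks, as suggested above
    cc -O2 -fsanitize=null example.c

    # make violations trap instead of printing a diagnostic
    cc -O2 -fsanitize=null -fsanitize-undefined-trap-on-error example.c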


> It probably shouldn't do that if you create a dynamic library that needs a symbol table but for an ELF binary it could, no?

It can't do that because the program might load a dynamic library that depends on the function (it's perfectly OK for a `.so` to depend on a function from the main executable, for example).

That's one of the reasons why a very cheap optimization is to always use `static` for functions when you can. You're telling the compiler that the function doesn't need to be visible outside the current compilation unit, so the compiler is free to even inline it completely and never produce an actual callable function, if appropriate.
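For example (hypothetical names):

    /* visible only inside this translation unit: the compiler may inline it
       everywhere and never emit a standalone definition */
    static int clamp(int x, int lo, int hi) {
        return x < lo ? lo : (x > hi ? hi : x);
    }

    /* exported: must stay callable, because a dlopen()'d .so could reference it */
    int to_byte(int x) {
        return clamp(x, 0, 255);
    }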


Sadly, most C++ projects are organized in a way that hampers static functions. To achieve incremental builds, stuff is split into separate source files that are compiled and optimized separately and only linked at the final step, which of course requires symbols.

I get it though, because carefully structuring your #includes to get a single translation unit is messy, and compile times get too long.


That’s where link-time optimization enters the picture. It’s expensive but tolerable for production builds of small projects and feasible for mid-sized ones.
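Roughly (standard gcc/clang flags; details vary per toolchain and build system):

    # compile each translation unit with LTO information...
    cc -O2 -flto -c a.c
    cc -O2 -flto -c b.c

    # ...then optimize across translation units at link time
    cc -O2 -flto a.o b.o -o prog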


That's one major reason why I don't like C++. I think the concept of header and implementation files is fine, but idiomatic C++ code basically breaks it. Surely a class should go into the implementation file? (Internal) types belong in the implementation; what belongs in headers are interfaces and function signatures. A class is a type, so it does not belong in a header file.


[[gnu::visibility("hidden")]] (or the equivalent for your compiler) might help.


> It can't do that because the program might load a dynamic library that depends on the function

That makes perfect sense, thank you!

And I just realized why I was mistaken. I am using fasm with `format ELF64 executable` to create an ELF file. Looking at it with a hex editor, it has no sections or symbol table because it creates a completely stripped binary.

Learned something :)


> Special Relativity (non-accelerating frames of reference, i.e. moving at a constant speed)

Sorry, but this is a pet peeve of mine: special relativity works perfectly well in accelerating frames of reference, as long as the spacetime remains flat (a Minkowski space[1]), for example when any curvature caused by gravity is small enough that you can ignore it.

[1] https://en.wikipedia.org/wiki/Minkowski_space


That's not great context: China and India have huge populations, so it's expected that they'd be at the top.

Better context can be found here[1] (countries by emissions per capita). It's still not great, because it shows a lot of small countries at the top. For example, Palau is first, but it has a population of a few thousand people, so its emissions are a rounding error compared to other countries.

[1] https://en.wikipedia.org/wiki/List_of_countries_by_carbon_di...


Per capita isn't the useful metric in this regard for the reason Palau illustrates. The climate cares about volume.

Per capita emissions is a way to assign relative sin by those who feel guilty about living large.

Bill Gates, today: "This is a chance to refocus on the metric that should count even more than emissions and temperature change: improving lives. Our chief goal should be to prevent suffering, particularly for those in the toughest conditions who live in the world’s poorest countries. The biggest problems are poverty and disease, just as they always have been. Understanding this will let us focus our limited resources on interventions that will have the greatest impact for the most vulnerable people."


Why? I would expect China to be at the top, since it's the #1 manufacturing country, but India is behind Germany, at #5.

How about GDP per unit of emissions? That would make China way higher than the US.

https://ourworldindata.org/grapher/co2-intensity


> Another respect is that C allows omitting curly braces after an if-statement, which makes bugs like https://www.codecentric.de/en/knowledge-hub/blog/curly-brace... possible.

This is a silly thing to point to, and the very article you linked to argues that the lack of curly braces is not the actual problem in that situation.

In any case, both gcc and clang will give a warning about code like that[1] with just "-Wall" (gcc since 2016 and clang since 2020). Complaining about this in 2025 smells of cargo cult programming, much like people who still use Yoda conditions[2] in C and C++.
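The kind of code in question looks roughly like this (a sketch in the spirit of the "goto fail" bug; the exact snippet behind the godbolt link may differ), and -Wall flags the misleading indentation:

    #include <stdio.h>

    int check(int err) {
        if (err != 0)
            goto fail;
            goto fail;   /* always executed, but indented as if guarded */
        printf("ok\n");
        return 0;
    fail:
        return -1;
    }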

C does have problems that make it hard to write safe code with it, but this is not one of them.

[1] https://godbolt.org/z/W74TsoGhr

[2] https://en.wikipedia.org/wiki/Yoda_conditions


And how much software is compiled with zero warnings?

And how many C programmers ignore such warnings because they are confident they know better?


It seems like you're trying to fix a social problem (programmers don't care about doing a good job) with a technical solution (changing the programming language). This simply doesn't work.

People who write C code ignoring warnings are the same people who in Rust will resort to writing unsafe with raw pointers as soon as they hit the first borrow check error. If you can't force them to care about C warnings, how are you going to force them to care about Rust safety?

I've seen this happen; it's not seen at large because the vast majority of people writing Rust code in public do it because they want to, not because they're forced.


> This simply doesn't work.

I think it works, and quite well even. Defaults matter, a lot, and Rust and its stdlib do a phenomenal job at choosing really good ones, compared to many other languages. Cargo's defaults maybe not so much, but oh well.

In C, sloppy programmers will generally create crashy and insecure code, which can then be fixed and hardened later.

In Rust, sloppy programmers will generally create slower and bloated code, which can then be optimized and minimized later. That's still bad, but for many people it seems like a better trade-off for a starting point.


> In C, sloppy programmers will [...]

> In Rust, sloppy programmers will [...]

You're comparing apples to oranges.

Inexperienced people who don't know better will make safe, bloated code in Rust.

Experienced people who simply ignore C warnings because they're "confident they know better" (as the other poster said) will write unsafe Rust code regardless of all the care in the world put in choosing sensible defaults or adding a borrow checker to the language. They will use `unsafe` and call it a day -- I've seen it happen more than once.

To fix this you have to change the process being used to write software -- you need to make sure people can't simply (for example) ignore C warnings or use Rust's `unsafe` at will.


Okay, I see what you mean now. I suppose that could be right.


> Kevin Weil had the two previous quotes in his context when he did his post and didn't consider the fact that readers would only see the first level, so wouldn't have Sebastien Bubek's post in mind when they read his.

No, Weil said he himself misunderstood Sellke's post[1].

Note Weil's wording ("10 previously unsolved Erdos problems") vs. Sellke's wording ("10 Erdos problems that were listed as open").

[1] https://x.com/kevinweil/status/1979270343941591525


Also, the previous comment omitted the fact that the now-deleted tweet from Bubeck begins with "Science revolution via AI has officially begun...".


Eh, I wouldn't be so sure. Reading the DMCA, their code does seem to do what the law says you can't do[1]:

    "No person shall circumvent a technological measure that effectively controls access to a work protected under this title [...]"
with these definitions[2]:

    (A) to “circumvent a technological measure” means to descramble a scrambled work, to decrypt an encrypted work, or otherwise to avoid, bypass, remove, deactivate, or impair a technological measure, without the authority of the copyright owner; and

    (B) a technological measure “effectively controls access to a work” if the measure, in the ordinary course of its operation, requires the application of information, or a process or a treatment, with the authority of the copyright owner, to gain access to the work.
I think (A) pretty clearly applies: the glyphs being randomized in each request obviously counts as being "scrambled", and the method used by the author (matching the provided SVG images, via their hashes, to the letters rendered with the book's font) clearly descrambles them.

I'm less sure about (B), not being a lawyer, but I think it's so generic that it does apply: the "ordinary course of [...] operation" of reading the book requires running the apps provided by Amazon. This seems to fit "requires the application of [...] a process [...] with the authority of the copyright owner".

[1] https://www.law.cornell.edu/uscode/text/17/1201

[2] https://www.law.cornell.edu/definitions/uscode.php?width=840...

