
Think how many game developers were able to realize their vision because Unity3D was accessible to them but raw C++ programming was not. We may see similar outcomes for other budding artists with the help of AI models. I'm quite excited!


I'm cautiously optimistic, but I also think about things like "Rebel Moon". When I was growing up, movies were constrained by their special effects budget... if some special effects "wizard" couldn't think of a way to make it look like Luke Skywalker got his hand cut off in a light saber battle, he didn't get his hand cut off in a light saber battle. Now, with CGI, the sky is the limit - what we see on screen is whatever the writer can dream up. But what we're getting is... pretty awful. It's almost as if the technical constraints actually forced the writers to focus on crafting a good story to make up for the lack of special effects.


Except 'their vision' is practically homogeneous. I can't even think of a dozen Unity games that broke the mould and genuinely stand out, out of the many tens of thousands (?).

There's Genshin Impact, Pokemon Go, Superhot, Beat Saber, Monument Valley, Subnautica, Among Us, Rust, Cities:Skylines (maybe), Ori (maybe), COD:Mobile (maybe) and...?


> Except 'their vision' is practically homogeneous. I can't even think of a dozen Unity games that broke the mould and genuinely stand out, out of the many tens of thousands (?).

You could say the same about books.

Lowering the barriers to entry does mean more content will be generated, and that content won't meet the same bar as when a middleman was the arbiter of who gets published. But at the same time, you'll likely get more hits and new developers, because you're getting more people swinging faster to test the market and hone their eye.

I am doubtful that there are very many people who hit a "Best Seller" 10/10 on their first try. You just never used to see it, or be able to consume it, because their audience was like 7 people at their local club.


It's now over eighteen years since the first few games made with Unity came out, and at best, being generous, there are maybe two dozen.

Which suggests that even after several iterations, the vast, vast majority of folks are not putting out anything noteworthy.


Necropolis, Ziggurat... IMO the best games nowadays are often the ones no one has heard of. Popularity hasn't been a good metric for a long while. And thankfully, games like "New World" and "Starfield" are helping the general population finally figure this out.


I don't agree with you at all.

Angry birds, Slender: The Eight Pages, Kerbal Space Program, Plague Inc, The Room, Rust, Tabletop Simulator, Enter the Gungeon, Totally Accurate Battle Simulator, Clone Hero, Cuphead, Escape from Tarkov, Getting Over It with Bennett Foddy, Hollow Knight, Oxygen Not Included, Among Us, RimWorld, Subnautica, Magic: The Gathering Arena, Outer Wilds, Risk of Rain 2, Subnautica: Below Zero, Superliminal, Untitled Goose Game, Fall Guys, Raft, Slime Rancher, Firewatch, PolyBridge, Mini Metro, Luckslinger, Return of the Obra Dinn, 7 Days to Die, Cult of the Lamb, Punch Club.

Many more where those came from.


Some other Unity games that are fun, and which others haven't mentioned:

Cuphead

Escape Academy

Overcooked

Monster Sanctuary

Lunistice


Kerbal Space Program is another.


True, KSP definitely qualifies as breaking the mould.


Rimworld. Dyson Sphere Program. Cult of the Lamb. Escape from Tarkov. Furi. Getting over it with Bennett Foddy. Hollow Knight. Kerbal Space Program. Oxygen not included. Pillars of Eternity. Risk of Rain 2. Tyranny.

I'd say all of those do some major thing that makes them stand out.


and Outer Wilds!


True, it definitely would count, at least more so than COD:Mobile.


The Long Dark.


I await more of the story campaign with bated breath. I'm adoring it, though the last episode felt a tad rushed, or flat maybe. To me, at least.


Valheim lol


Yeah, I can definitely see how Beat Saber, Hollow Knight, and Tunic didn’t really do anything particularly creative or impressive. /s


I mentioned Beat Saber. Did you skip reading the list?


    indexes = function(data)
    query = function(indexes)
How does this model a classic banking app where you need to guarantee that transfers between accounts are atomic?


This is the bank transfer example we show in rama-demo-gallery. There's both Java and Clojure versions of that example. https://github.com/redplanetlabs/rama-demo-gallery

The atomic bank transfer is done as part of function(data). The data record contains fromAccountId, toAccountId, and amount. The function applies the transfer if it's valid (fromAccountId has at least that amount of funds), and no-ops otherwise.
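
In JavaScript-ish pseudocode, the logic is roughly this (an illustrative sketch only, not the actual Java/Clojure code from the demo gallery; the `funds` map of accountId -> balance is hypothetical):

    // Sketch of the ETL logic: apply the transfer if valid, else no-op.
    function applyTransfer(funds, data) {
      const { fromAccountId, toAccountId, amount } = data;
      const fromBalance = funds.get(fromAccountId) ?? 0;
      if (fromBalance >= amount) {
        funds.set(fromAccountId, fromBalance - amount);
        funds.set(toAccountId, (funds.get(toAccountId) ?? 0) + amount);
      }
      // invalid transfers fall through: the record is consumed, nothing changes
      return funds;
    }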


Thanks for sharing this, very interesting.

It seems that you cannot, in one database call, make a transaction?

You would need to push your transfer to the depot, then poll until it has been accepted.

If you do not poll, then your transaction may be written but as a no-op.

Does this increase latency for such applications?


This example uses microbatching for the processing, so the latency will be ~200 millis. You don't need to poll and could set this up with a reactive PState query to know when the transaction is done. 200 millis is an acceptable latency for a task that needs to have strong cross-partition atomicity.

Note that depot appends can be done with "acking" so as not to return success until all colocated stream ETLs have finished processing the record. Stream ETLs also take only a handful of millis to complete. This is how you would coordinate a front-end with many tasks like you typically do with databases today (e.g. registering an account, updating a profile, adding something to a shopping cart, making a friend request, etc.).

This example uses microbatching because getting cross-partition transactionality with streaming is quite a bit harder, as streaming has either at-least-once or at-most-once semantics depending on how you configure it. Microbatching always has exactly-once semantics regardless of any failures that happen during processing.


An important factor is which country we are talking about. Only the US really has access to this "glitch", because it issues the world's reserve currency. UK austerity was so brutal because there's no real demand for GBP outside of trading with the UK.


I met someone in Berlin who told me that he was a member of the UK Conservative party, and that he was on course to become a Conservative party councillor before he, too, left the country. He made at least four testably false statements in that discussion (I made notes; one was about how long Cameron had been prime minister), so take this with a pinch of salt:

He openly admitted that the Conservative party manifesto of 2010 was a lie. That it was totally impossible to do what they promised about the deficit. Now, I'm not sure I remembered the manifesto pledge correctly, but even with that caveat, he still insisted that lying was absolutely acceptable. He refused to accept that normalising lies in this fashion is a problem (I wish I'd asked him about the Boy Who Cried Wolf, but I didn't) and then promptly changed the topic.


This isn't true. The UK can spend as freely as the US. Being the reserve currency has nothing to do with it. Demand for the currency is unrelated to how much a sovereign state can spend.


It is related to how much the country can spend through borrowing and printing. Very few countries spend less than they raise through taxation.


I think a lot of the problem with the UK is that the housing market takes the wind out of everything else's sails. This has the economic effect of reducing the amount people have to spend on non-housing things, and it also creates the psychological malaise of 'why should I work harder just so I can pay off some random baby boomer's mortgage instead of my own?'.

If we can fix housing, we can fix a lot of what's wrong with the UK, I think.


> the barrier to rolling my own is lower for me than introducing a Java dependency.

Really? I never had problems using some JVM base image and deploying via a Docker image, which is what I would be doing anyway.


I've had enough problems with JVM deployments over the years to just not want the hassle in order to use something which adds marginal value. If you don't, then by all means, use it - I'm not suggesting it's bad for everyone. If it provided some massive major advantage over the large number of other options, I'd consider it too, but I don't see anything that indicates that's the case here.


Do you reject Kafka because it runs on the JVM? This is a ludicrous position to take. You may not want to fund development of projects when you're forced to train devs to work on them, but rejecting technologies based on their programming language / framework is just fundamentally flawed. Don't use Python tools because I hate Python; don't use Kubernetes because it's Go; don't use AWS because it has a bunch of Java services. S3? I'm not entirely sure, but I wouldn't be surprised if it was Java-based at the edge as well.


Note this line:

> If it provided some massive major advantage over the large number of other options, I'd consider it

People reject technologies because of the impact of having to support them all the time when the benefit does not justify taking on one more thing to support. I don't have an issue with using JVM-based services if I can have someone else manage them for me, but when I have to deal with the operational issues myself, or within my team, then how comfortable we are with managing a specific technology vs. the advantages it brings over competing alternatives absolutely is part of the consideration, and it'd be irresponsible for it not to be.

In this case the benefit seems marginal at best, and I'm not deploying a JVM-based service for marginal benefits given how much hassle they've brought me over the years.


> Do you reject Kafka because it runs on the JVM?

Nothing is outright rejected because it runs on the JVM, but it is a detracting quality, adding complexity. If Kafka is clearly the best solution available for your problem, you'll suck it up and use the JVM, but if there is another approach that will also serve your needs without a JVM dependency, then the choice is easy.


I need to use Java exactly zero times when interacting with those; they have dedicated clients for everything, something this stuff is apparently proud of not having.


So now you are dependent on the JVM and Docker.


> Have you tried to generate a SHA256 checksum for a file in the browser

Have you tried to do this in Python?

A Node comparison would be more appropriate.
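
For context, the browser version being discussed looks roughly like this with the standard Web Crypto API (a sketch; error handling omitted):

    // SHA-256 of a user-selected File in the browser.
    async function sha256Hex(file) {
      const digest = await crypto.subtle.digest("SHA-256", await file.arrayBuffer());
      return [...new Uint8Array(digest)]
        .map((b) => b.toString(16).padStart(2, "0"))
        .join("");
    }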


This is not the gotcha you think it is.


Wow that's an interesting thought.

If everyone did their neighbor's laundry (say) and charged a fee, then everyone would be doing the same amount of laundry and have the same amount of money, yet GDP would rise?


Yes, there is also this nice story to illustrate this:

Two economists are walking in a forest when they come across a pile of shit.

The first economist says to the other “I’ll pay you $100 to eat that pile of shit.” The second economist takes the $100 and eats the pile of shit.

They continue walking until they come across a second pile of shit. The second economist turns to the first and says “I’ll pay you $100 to eat that pile of shit.” The first economist takes the $100 and eats the pile of shit.

Walking a little more, the first economist looks at the second and says, "You know, I gave you $100 to eat shit, then you gave me back the same $100 to eat shit. I can't help but feel like we both just ate shit for nothing."

"That's not true", responded the second economist. "We increased the GDP by $200!"

copied from: https://news.ycombinator.com/item?id=37395566 (but they probably also copied it from somewhere)


Yes, GDP is an unreliable measure of value created.

It's not just that some work isn't counted. It's also that work is counted based on monetary remuneration, even if its contribution to the general welfare is very low or negative.


Use the spread syntax to make a shallow copy; it's more performant.
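
For example (with structuredClone, the built-in deep clone, for comparison):

    const original = { user: { name: "Ada" } };
    const shallow = { ...original };        // fast: copies the top level only
    shallow.user.name = "Grace";            // careful: also mutates original.user
    const deep = structuredClone(original); // slower, but fully independent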


I've never had the need to clone anything in a real JS code-base, personally.

What's the use-case?


It is very frequently needed when you're working with a component framework like React or Vue. Typically leaf components shouldn't mutate properties directly but rather emit a changed version of the original data they receive.

But it's not necessarily related to frameworks; if you're working with complex enough data structures and you're following a functional approach, you'll need to do similar things sooner than later.
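
For example, in the leaf-component case a handler might look like this (a sketch; `emit` and the other names are hypothetical):

    // Emit an updated copy instead of mutating the `item` prop in place.
    function onRename(item, newName, emit) {
      emit("update", { ...item, name: newName });
    }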


I use React.

Why does that require deep clones?

I simply do:

    return {
      ...current,
      foo: "bar"
    };


I'm not sure if you're asking why deep copies are useful or something else.

Maybe you're handing over your data to a library developed by someone else and you want to make sure the library cannot mutate your original data, so you opt to pass a deep copy instead. Or maybe you are the author of said library and you want to make sure you preserve the original data so you copy it on input rather than on output.

There are many situations where deep-copying is useful but I agree that you should use the simplest pattern that works for your use-case.
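
A sketch of the library-author case (all names hypothetical):

    // Deep-copy the input so later mutations by the caller can't leak
    // into the library's internal state.
    function createStore(initialState) {
      const state = structuredClone(initialState); // copy on input
      return { getState: () => state };
    }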


if "current" is a deep object here and contains other objects/arrays, you risk that wherever you are sending this shallow copy will mutate those deeper values and potentially mess things up for the code where "current" came from.

Maybe it's not a situation that comes up often, but it would be fairly hard to debug and guarding yourself against mysterious problems in advance is always neat.


In practice, I think it's easier to use linting and type systems to prevent other code from mutating your stuff than to copy defensively.
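
Object.freeze is the runtime cousin of the same idea, as a belt-and-braces sketch:

    "use strict";
    // Freeze instead of copying: accidental writes throw in strict mode.
    const config = Object.freeze({ retries: 3 });
    // config.retries = 5; // TypeError: assignment to read-only property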


You could make the same argument backwards though - many people may find it easier to do deep copies than to bring in extra tooling they might not be familiar with.


When the caller passes you a deep structure and you want to ensure they don't mutate it afterwards. But I agree, it's seldom needed in application code.


I came across a bug recently where a data structure from Redux, which is immutable, was passed to a GraphQL library that would modify the data it was passed. So we had to make a deep clone.
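
The fix was essentially this (a sketch; `runQuery` stands in for the GraphQL call):

    // Hand the mutating library a deep clone instead of the immutable
    // Redux state itself.
    const input = structuredClone(stateFromStore);
    runQuery(input); // free to mutate its own copy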


Every time you get an object from somewhere that needs to be preserved and modified at the same time.


The docs are really poor, which makes me less confident that I have configured everything correctly.

