Hacker News | beardbound's comments

I mostly agree with you, but if a human straight up copies work under copyright, they're violating the law. It seems like an LLM should be held to the same standard, unless we've decided it should be even less beholden to the law than people are.

It's also incredibly hard to tell whether an LLM copied something, since you can't question it in court and it probably couldn't tell you even if you could.


From what I have seen, the (US) courts seem to make a distinction between 100% machine-automated output with no manual prompting at all, versus a human giving it specific instructions on what to generate. (And yes I realize everything a computer does requires prior instruction of some kind.)

But I think the copyright issue really arises from the distribution of a (potentially derivative, or transformative in the legal sense) work, which is typically done by a human to some extent. So I think the human would be on the hook for any potential violations in that case, possibly even if they can't actually produce sources themselves because the output was LLM-generated.

But the legal test always seems to come back to what I said before, simply "how much was copied, and how obvious is it?", which is going to be up to the subjective interpretation of the judge in each case.


Mother Night was my favorite Vonnegut book.


I don't have much of an opinion on React as a library, although we do use it for our web frontend at my work. However, something I noticed in the past when I was doing some web dev work in Australia was that some websites would have incredibly buggy behavior. Keep in mind this was around 10 years ago, so some things might have changed. After doing some debugging, it turned out that some of the libraries were fairly large and didn't have co-located CDNs in the area. The main culprit was jQuery. That caused some JS libraries to not fully download, or to simply time out.

That's a mostly fixable problem: set up something like a pipeline that minifies the libraries and serves them from somewhere closer to your users, or put them on a better CDN. But a lot of people don't have the time/option/money to do that. It was pretty obvious in that instance since a lot of it was WordPress development at the time, and plugin/extension authors usually don't bother. When I was doing similar development in the US I never saw that kind of behavior, since so many libraries at the time were hosted on the West Coast where I lived. Also, most developers I know aren't particularly resource-limited on their hardware (desktop/laptop or mobile), which might be a factor.
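For what it's worth, the rewrite half of that kind of pipeline doesn't need much tooling. Here's a rough stdlib-only Python sketch (the CDN URL and local path are made-up examples) that swaps known third-party script URLs for self-hosted copies:

```python
import re

# Map of third-party CDN URLs to self-hosted paths (example values only).
LOCAL_ASSETS = {
    "https://code.jquery.com/jquery-3.7.1.min.js": "/assets/js/jquery-3.7.1.min.js",
}

def localize_scripts(html: str) -> str:
    """Rewrite <script src="..."> tags that point at known CDNs so they
    load from paths served by your own (closer) origin instead."""
    def swap(match: re.Match) -> str:
        url = match.group(1)
        return match.group(0).replace(url, LOCAL_ASSETS.get(url, url))
    return re.sub(r'<script[^>]*\bsrc="([^"]+)"', swap, html)

page = '<script src="https://code.jquery.com/jquery-3.7.1.min.js"></script>'
print(localize_scripts(page))
# -> <script src="/assets/js/jquery-3.7.1.min.js"></script>
```

In practice you'd run something like this at build time over your templates, after downloading and minifying each asset once, so end users never hit the distant CDN at all.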

I'm not sure if this is what people are referring to, but just something to consider.


That’s a good point. I guess I am privileged to live in a well connected area in the states.


Sure, but it's also paved with bad intentions, and neutral intentions. I would say that intentions have very little effect on the overall outcome of actions in general. Also good and bad are relative.

I would say that the road to outcomes is paved with actions. Not as pithy as the original, though.


You can turn them off in accessibility settings https://support.apple.com/guide/iphone/reduce-onscreen-motio.... I agree it's not something they make easy, though. A lot of cool features/customization like that is buried in accessibility settings.


On a side note, they (Google included) really should drop the "accessibility" naming.

I don't know how much sense it made even when it was mainly about letting blind users navigate the UI (it could have been a "navigation" or "interactions" setting).

But nowadays with the amount of stuff in there, including permissions for password managers, animation reduction, audio EQ etc., it just makes less and less sense.


Proxmox uses KVM for virtualization and Linux Containers (LXC) for the containers. I agree that something like terraform and/or ansible would make more sense for an IAC (infrastructure as code) deployment. Most of the people I talk to who use Proxmox for a homelab prefer to do things manually and don't bother with any IAC implementation.

For work I'm a firm believer in reproducible environments and IAC. We actually use a combination of vagrant, libvirt, and KVM to spin up local clusters for quick testing and development. It works out pretty well, but in my homelab I don't have anything complicated enough to bother setting up terraform/ansible for. Although I imagine if my server crashed, I probably wouldn't think that way anymore.


Given enough appetite (and love from me), I've got a pretty robust Ansible script for building a Proxmox host with $i++ LXCs.

It's not suitable for open sourcing yet (embedded secrets and the like), but if the community wants it, it's pretty solid.

The only issue I see is that the Ansible script currently always expects to be building a cluster of Proxmox hosts. I'd need to make some changes so it can build out just a single node, though.

I've been using it for ~3 years now for my Proxmox cluster home lab, which predominantly hosts LLaMA, the *Arr stack, Deluge, Nginx, Tailscale, and a few other services.

It's not quite a one-click deployment, but it can build out an entire cluster in 30 minutes after an initial Proxmox install is completed.


Proxmox turns all of that into a point-and-click appliance, so you can focus on reliability and actually doing something with the technology.

Just because someone doesn't use vagrant, libvirt and KVM to spin up local clusters manually, doesn't mean they don't know how.

There is no shortage of Proxmox users who grew up in datacentres, from bare-metal servers through the first wave of virtualization and beyond.


Why is something like Proxmox a bad target for IAC?


If you want to manage VMs, then you're probably using Terraform plus a provider. However, SDN (Software Defined Networking) is not yet supported [1], which makes any kind of deployment with network separation infeasible (using IAC alone).

[1] https://github.com/bpg/terraform-provider-proxmox/issues/817
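In the meantime, the rest of the Proxmox HTTP API is scriptable directly. A minimal stdlib-only Python sketch that just composes the request for the "create LXC container" endpoint (POST /api2/json/nodes/{node}/lxc); the host, node, vmid, and template names are hypothetical, and actually sending the request would need a live host plus an API token:

```python
from urllib.parse import urlencode

def lxc_create_request(host: str, node: str, vmid: int, hostname: str,
                       ostemplate: str) -> tuple[str, bytes]:
    """Build the URL and form-encoded body for Proxmox's
    'create LXC container' endpoint: POST /api2/json/nodes/{node}/lxc."""
    url = f"https://{host}:8006/api2/json/nodes/{node}/lxc"
    body = urlencode({
        "vmid": vmid,
        "hostname": hostname,
        "ostemplate": ostemplate,  # a container template on local storage
    }).encode()
    return url, body

url, body = lxc_create_request(
    "pve.example.lan", "pve1", 105, "web01",
    "local:vztmpl/debian-12-standard_12.2-1_amd64.tar.zst",
)
print(url)
# To actually send it, you'd add an 'Authorization: PVEAPIToken=...' header
# and POST with urllib.request (omitted here since it needs a live host).
```

It's obviously no substitute for a real provider, but for the gaps (like SDN) a thin script over the API can fill in until upstream support lands.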


I've seen this more and more in recent memory. I feel like the standard used to be to define an acronym the first time you use it; after that you can use it without explanation.

Such as the TLA (three letter acronym) used here.


That's an interesting thought. I've done quite a bit of work in the medical/healthcare space in tech in the past. I think part of your issue is that doctors are notoriously hard to get a hold of. Also, they rarely make any of the tech purchasing decisions.

If you're just generally trying to get an idea of pain points in the industry, I think you should get more specific. Here are a few examples:

- Are you trying to make the interface between the doctor and the patient easier?

- Are you going to look at the way the doctors interact with the EHR (electronic health records) vendors?

- Are you trying to make the check-in process easier for patients?

- Are you trying to provide analytics for long term quality of care metrics on hospitals?

- Are you targeting small clinics, hospitals, or ER/minor emergency rooms?

Also, for cold calling doctors on LinkedIn, I am not surprised that you aren't getting much feedback. Most of the doctors I've known tend not to be into online stuff. You might have more luck reaching out to a local professional organization.

Also, you might think about trying to contact nurses, nurse practitioners, or physician's assistants. There might be a local nurses' professional organization you could reach out to who could ask their members if anybody is interested. Just frame it as you're trying to figure out a way to make their jobs easier.

This ended up being a little rambly, but I hope it's useful. Cheers.


Thank you. YES on a lot of your bullet points. But I'm very open-minded about it and would work on any issue if enough people raise it.

I've been thinking of reaching out to nurses and other pros that work around docs.

I like the professional org idea.


For what it's worth, it does require a Meta account, but not a Facebook one. I refused to buy one while it required a Facebook account, since I deleted mine a couple of years ago. Once they made the change, I figured that was an OK compromise. I just found out today during the outage that my headset won't work if I get signed out of my Meta account. That was an unpleasant realization, although I suppose it's partially my fault for trusting Meta not to hamstring the hardware they're selling.

It's the equivalent of finding out that, if Microsoft's auth servers go down, no one with a Windows PC can use it because they can't authenticate. I'm fairly displeased.


I agree that people value different things, but I wouldn't say information about brand reliability is widely available. I have a Consumer Reports membership for checking that kind of thing, simply because most of the info out there is blogspam and fake review sites.

Information from reputable sources on large appliance purchases is incredibly hard to find since most people don’t buy appliances often, so even consumers used to doing online research will be hard pressed to know which sites to trust. I know that I wouldn’t have a clue.

If there is a place besides Consumer Reports, which is a paid service, where you can check these things, I would be very interested in hearing about it.


How would a site like this even get paid? From appliance company commissions?

