
When maintaining a quantum memory, you measure parity checks of the quantum error correcting code. These parity checks don't contain any information about the logical state, just (partial) information about the error, so the logical quantum information remains coherent through the process (i.e. the logical part of the state is not collapsed).

These measurement outcomes are classical data, and a computation is required to infer the most likely error consistent with the measured syndrome. This process is known as decoding.

This work is a model that acts as a decoder for a very common quantum code -- the surface code. The surface code is, loosely, the quantum analog of a repetition code.
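
To make "decoding" concrete, here's a minimal sketch (my own toy example, not the paper's algorithm) for a classical 3-bit repetition code: the parity checks between neighboring bits form the syndrome, and the decoder infers the most likely error without ever looking at the encoded bit itself.

    # Syndrome decoding for a classical 3-bit repetition code.
    # The two parity checks compare neighboring bits; they reveal where a
    # flip happened but say nothing about the encoded logical bit itself.

    def syndrome(bits):
        """Parities of adjacent pairs: (b0 ^ b1, b1 ^ b2)."""
        return (bits[0] ^ bits[1], bits[1] ^ bits[2])

    # Most likely single-bit error for each syndrome value.
    DECODE = {
        (0, 0): None,  # no error detected
        (1, 0): 0,     # bit 0 flipped
        (1, 1): 1,     # bit 1 flipped
        (0, 1): 2,     # bit 2 flipped
    }

    def correct(bits):
        flip = DECODE[syndrome(bits)]
        out = list(bits)
        if flip is not None:
            out[flip] ^= 1
        return out

    print(correct([1, 0, 1]))  # -> [1, 1, 1], assuming at most one flip

A surface-code decoder does the same job at scale, where the lookup table gets replaced by something like minimum-weight perfect matching or, in this case, a learned model.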


I would instead give the example of the Hamming code. As you probably know, you can construct a quantum code, the Steane code, that is directly analogous to the Hamming code.

The Steane code is the simplest triangular color code, i.e. you can arrange all the qubits on a 2D triangular lattice and only do nearest-neighbor interactions [1]. The surface code is a similar quantum code in which the qubits are also placed on a 2D lattice, except that the lattice is made up of squares.

Why do we care about 2D surfaces and nearest-neighbor interactions? Because they make building quantum hardware easier.

EDIT:

[1] A picture of the Steane code is shown here: https://errorcorrectionzoo.org/c/steane The seven data qubits sit on the vertices of the triangles, with two syndrome qubits on each of the faces.
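
To see why the analogy is so direct, here's a rough sketch (my own illustration, not from the linked page) of how the Hamming parity checks pin down a single flipped bit; the Steane code reuses the same check matrix twice, once for X-type and once for Z-type stabilizers.

    import numpy as np

    # Parity-check matrix of the [7,4] Hamming code. The Steane code uses
    # these same seven-bit checks twice: once for X and once for Z errors.
    H = np.array([
        [1, 0, 1, 0, 1, 0, 1],
        [0, 1, 1, 0, 0, 1, 1],
        [0, 0, 0, 1, 1, 1, 1],
    ])

    def locate_single_error(received):
        """Return the index of the flipped bit (or None) from the syndrome."""
        s = H @ received % 2
        if not s.any():
            return None
        # Column j of H is j+1 written in binary, so the syndrome *is*
        # the (1-indexed) position of the flip.
        return int(s[0] + 2 * s[1] + 4 * s[2]) - 1

    codeword = np.zeros(7, dtype=int)      # the all-zeros codeword
    corrupted = codeword.copy()
    corrupted[4] ^= 1                      # flip bit 4
    print(locate_single_error(corrupted))  # -> 4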


I've often wondered why MIMO is such a heavily investigated topic. It would make sense if the Shannon limit were higher for this channel. Is there a foundational paper or review that shows this?


I believe that a classical radio receiver is measuring a coherent state. This is a much lower-level notion than people normally think about in QEC, since there the physical degrees of freedom are usually already fixed (and assumed to be a qubit!). The closest analogue might be different choices of qubit encodings in a bosonic code.

In general, I'm not sure that the classical information theory toolkit allows us to compare a coherent state with some average occupation number N to, say, M (not necessarily coherent) states with average occupation number N' such that N' * M = N. For example, you could use a state that is definitely not "classical" / a coherent state, or you could use photon-number-resolving measurements.

A tangential remark: classical information theory uses the notion of "energy per bit" to compare more universally across information transmission schemes. So they would ask something like "How many bits can I transmit with X bandwidth and Y transmission power?"
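
For a sense of what that question looks like in practice, here's a hedged sketch using the textbook AWGN Shannon capacity, C = B * log2(1 + P / (N0 * B)); the numbers are made up purely for illustration.

    from math import log2

    def awgn_capacity(bandwidth_hz, power_w, noise_psd_w_per_hz):
        """Shannon capacity of an AWGN channel in bits per second."""
        snr = power_w / (noise_psd_w_per_hz * bandwidth_hz)
        return bandwidth_hz * log2(1 + snr)

    B = 1e6     # 1 MHz of bandwidth
    P = 1e-9    # 1 nW of received power
    N0 = 1e-17  # noise power spectral density, W/Hz

    C = awgn_capacity(B, P, N0)
    print(f"capacity ~ {C / 1e6:.2f} Mbit/s")
    print(f"energy per bit ~ {P / C:.2e} J/bit")

That last line is the "energy per bit" figure of merit: it lets you compare schemes with very different bandwidths and modulations on the same axis.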


Are Q-switched / pulsed lasers common in industrial applications?


Answering generally, I would say yes: they are one of the common types of laser, chosen depending on the application.


What bank is this and are they available nationwide?


First Tech CU. Their physical locations are PNW only, but that hasn't stopped me from continuing to use them electronically on the east coast. They are also part of the CU alliance, so access to alliance branches and ATMs is possible (I've never had the need to test this).


Yes. Why are banks with TOTP so rare?!


I have no idea, and I despise it. USAA and eTrade both have TOTP, exclusively with the shitty, non-backup-able Symantec VIP app. Break your phone? You're boned! Symantec VIP on those sites doesn't provide 2FA verification (the thing where the phone asks you to confirm the number shown client-side), and it doesn't provide push notifications.

It's literally a worse version of regular TOTP. And they're in the minority even having 2FA!



This works for Charles Schwab too!


and fidelity!


What could possibly go wrong using an open source project for authentication against your bank accounts? Where have we seen this before?

You best audit the shit out of that code if you actually use it. Every. time. they. update.


For a circuit of size C, the size of a fault-tolerant circuit that computes the same thing is O(C · polylog C):

https://arxiv.org/abs/quant-ph/9906129


Technically correct is the best kind of correct.


I’m generally in agreement with the anti-bash camp, but I can name about that many :)

- Mutating default arguments to functions, so subsequent calls have different behavior

- Somewhat particular rules around creating references vs copies

- Things that look like lambda captures but aren’t quite
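
Quick repro snippets for each, in case anyone hasn't been bitten yet (contrived, but they run):

    # 1. Mutable default argument: the list is built once, at definition time.
    def append_item(x, bucket=[]):
        bucket.append(x)
        return bucket

    print(append_item(1))  # [1]
    print(append_item(2))  # [1, 2]  <- same list as the first call

    # 2. References vs copies: assignment rebinds a name, it never copies.
    a = [[0] * 3] * 2      # two references to the *same* inner list
    a[0][0] = 99
    print(a)               # [[99, 0, 0], [99, 0, 0]]

    # 3. Closures capture the variable, not its value at creation time.
    callbacks = [lambda: i for i in range(3)]
    print([f() for f in callbacks])  # [2, 2, 2], not [0, 1, 2]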


Yep, every language has footguns and other kinds of quirks, but I contend that the "footguns per character" ratio in shell is unusually high. (It is not unique in having a high ratio, though; other popular languages, like C++ for instance, also have this issue.)


The worst (level of nastiness * usage) offenders all probably have a reason for being popular despite their flaws:

- Bash: installed everywhere you want to work (yes, who actually wants to work on Windows ;-)

- C/C++: when speed/size matters, there was no alternative except assembly until recently

- JavaScript: until recently this was the sanest option for client-side code on the web (ActiveX and Java applets existed, yes, but managed to be even worse)

- PHP: low-cost hosting, function-as-a-service way before that became popular, shared-nothing architecture, and instant reload for local-development bliss


Couldn't agree more.


- string vs. list "in" ('a' in 'a' is True, but 'a' in ['a'] is also True)

- cannot know which object attributes are private or public (and some classes use settable properties so you can't say "just don't set any attributes on non-dataclass objects")
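
Minimal repros of both (the second shows one flavor of the attribute problem; Config is a made-up example):

    # 'in' means substring test on strings but element test on lists:
    print('a' in 'abc')        # True  (substring)
    print('a' in ['a', 'b'])   # True  (element)
    print('ab' in 'abc')       # True
    print('ab' in ['a', 'b'])  # False <- bites you when a str sneaks in
                               #    where you expected a list of strs

    # Attribute access is wide open: a typo silently creates a new attribute.
    class Config:
        def __init__(self):
            self.timeout = 10

    cfg = Config()
    cfg.tiemout = 30           # oops -- no error, cfg.timeout is still 10
    print(cfg.timeout)         # 10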


Touché!


A few things come to mind from more modern aircraft like the 787/A350: fly-by-wire (flight envelope protection), electronically actuated control surfaces (less hydraulics / reduced complexity), bleedless engines (greater efficiency), greater use of composites (less weight), modern wing design using computational modeling/optimization (possibly more efficient), and essentially as large a turbofan fan as you would like.


I want to point out that the experiment was at Harvard in the Lukin group. There is a proposal for constant-rate encodings using large quantum low-density parity-check codes via atom rearrangement, which could in principle achieve such a high encoding rate. That said, it's certainly not mainstream yet. https://arxiv.org/abs/2308.08648


Yes, good point (apologies to the Lukin group). That's an interesting proposal, but it seems from a cursory read that you would still need very many physical qubits to approach that asymptotic rate, and you would also be forced to take a very large slowdown from serializing all of your logical operations through a smaller set of conventionally encoded logical qubits. That said, I'm not current on state-of-the-art LDPC QEC proposals, so I'll moderate my claim a bit to "the first actually useful logical qubits will almost certainly have an encoding rate lower than 1/5".


"Peanut Gallery" here... These types of conversations are the reason I'm still addicted to HN.

Thank you both. And /hattip


This is pretty neat! I think it's worth doing more analysis of the entropy cells separately from the entropy extractor. For example, the von Neumann extractor requires exchangeability of the input bits. Does this hold? In the setting where there is no phase noise from the entropy cells (and hence no entropy), does the TRNG give any output? Assuming some basic model of the gate delays (maybe measure them?), how much entropy should be generated by each entropy cell?
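
For context on the exchangeability point, here's a minimal sketch of the von Neumann extractor (my own toy code, not the project's implementation): it's only guaranteed unbiased if the two bits in each pair are exchangeable, and correlations across pairs leak into the output.

    import random

    def von_neumann_extract(bits):
        """Debias by non-overlapping pairs: '01' -> 0, '10' -> 1, drop the rest."""
        out = []
        for b0, b1 in zip(bits[::2], bits[1::2]):
            if b0 != b1:
                out.append(b0)
        return out

    # A biased but independent source: the output is still ~50/50.
    biased = [1 if random.random() < 0.8 else 0 for _ in range(100_000)]
    extracted = von_neumann_extract(biased)
    print(sum(extracted) / len(extracted))  # ~0.5

    # If the bias drifts between the two samples of a pair (so P(01) != P(10)),
    # the output bit is biased; correlations across pairs make the output bits
    # correlated even when each one is individually fair.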

