A new, faster algorithm for the group isomorphism problem (quantamagazine.org)
110 points by theafh on June 23, 2023 | 18 comments


The "major algorithmic goal" is better time complexity for group isomorphism.

Xiaorui Sun improved on Robert Tarjan's (50-year-old) result,

  n^(log n)
achieving

  n^((log n)^(5/6))
for certain types of groups that appear to be easier to compare but had been defying improvement attempts for decades.
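To get a rough feel for the size of that improvement, here's a quick back-of-the-envelope comparison of the two exponents (my own sketch, taking logs base 2; the choice of base only shifts constants in the exponent):

    import math

    # Compare the exponent of n in the old bound n^(log n)
    # with the exponent of n in the new bound n^((log n)^(5/6)).
    for n in (2**10, 2**20, 2**40):
        log_n = math.log2(n)
        old_exp = log_n              # n^(log n)
        new_exp = log_n ** (5 / 6)   # n^((log n)^(5/6))
        print(f"n = 2^{int(log_n)}: old exponent ~ {old_exp:.1f}, new exponent ~ {new_exp:.1f}")

At n = 2^40 that's n^40 versus roughly n^21.6, and the gap keeps growing with n.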

---

Anyone see intuitively how the 5 and 6 come into the improved complexity -- why those two particular constants?


Strassen's algorithm is, I think, one of the easier ones to understand along these lines. It uses a recursive divide and conquer strategy. The naive algorithm does 1 (N x N) multiplication by adding the results of 8 multiplications of size (N/2 x N/2), and (sweeping details under the rug here) that takes time N^log2(8) = N^3. The improved algorithm uses 7 multiplications of size (N/2 x N/2) and takes N^log2(7). For those details, the wiki page does a nice job.

https://en.m.wikipedia.org/wiki/Strassen_algorithm
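If it helps, here's a minimal runnable sketch of that recursion (my own toy version: it assumes square matrices whose side is a power of 2, and skips the usual cutoff to the naive method for small blocks):

    import numpy as np

    def strassen(A, B):
        """Toy Strassen multiply for n x n matrices with n a power of 2."""
        n = A.shape[0]
        if n == 1:
            return A * B
        h = n // 2
        A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
        B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]
        # 7 half-size products instead of the naive 8 -> N^log2(7) overall
        M1 = strassen(A11 + A22, B11 + B22)
        M2 = strassen(A21 + A22, B11)
        M3 = strassen(A11, B12 - B22)
        M4 = strassen(A22, B21 - B11)
        M5 = strassen(A11 + A12, B22)
        M6 = strassen(A21 - A11, B11 + B12)
        M7 = strassen(A12 - A22, B21 + B22)
        return np.block([[M1 + M4 - M5 + M7, M3 + M5],
                         [M2 + M4, M1 - M2 + M3 + M6]])

    A = np.random.rand(8, 8)
    B = np.random.rand(8, 8)
    assert np.allclose(strassen(A, B), A @ B)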

So intuitively speaking: when I see a logarithm like that with an integer ratio, I think "oh, somebody saved 1/6th of the work in a recursive algorithm."


This is a power of a logarithm though? Whereas you're talking about the base and its argument.


No, Strassen's isn't an exact match in this case. It's just an example of how strange exponents can crop up in complexity analysis. I was aiming to satisfy the original comment's ask for "intuition" rather than "read the entire paper and digest it in time to make an unassailable ELI5 while waiting for the bus."


This isn't just "an exponent" though, it's an exponent on a logarithm that itself sits in the first exponent. It sure looks like a radically different beast from plain recursion or the Master theorem. The mere presence of strange numbers in some exponent really doesn't say anything - I doubt anyone derives any intuition from the Strassen case, since, as I'm sure you can imagine, algorithms that take n^(5/6) time, vs. (log n)^(5/6) time, vs. n^(log(5/6)) time, vs. n^((log n)^(5/6)) time... really need not have any common intuition behind them whatsoever.
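Just plugging a single value of n into those four shapes makes the point (my own quick check, logs base 2):

    import math

    # The four runtime shapes mentioned above, evaluated at n = 2^20.
    n = 2**20
    print(n ** (5 / 6))                    # n^(5/6): ~1.0e5
    print(math.log2(n) ** (5 / 6))         # (log n)^(5/6): ~12.1
    print(n ** math.log2(5 / 6))           # n^(log(5/6)): ~0.026 (negative exponent)
    print(n ** (math.log2(n) ** (5 / 6)))  # n^((log n)^(5/6)): ~1e73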


Really leaning into that "somebody's wrong on the internet" thing, aren't you. If you have a better example that has exactly the right runtime and admits a simple description, feel free to share with the class.


I can't say for "intuitive", but the introduction to the paper involves breaking p-groups — groups having p^k members where p is a prime — into the cases k > lg(p)^5 and k <= lg(p)^5, which likely explains why the number 5 has crashed the party. (Here lg() denotes the log base 2)


I've wished for a long time that mathematicians would bring back the notation "ld" (logarithmus dualis) for log₂, which I find very elegant. "ln" already comes from Latin (logarithmus naturalis, which is why it's ln instead of nl), established by Gauss or somebody, I think.



But ld should denote the decimal logarithm. Binary logarithm should be denoted with lb.


I believe this is the punchline but I still can't figure out what the 6 is about:

    To obtain Theorem 1.1, let k denote log_p(n). We apply Theorem 1.2 for the case of k > (log2(p))^5
    by constructing the skew-symmetric matrix spaces for both input groups according to the Baer’s
    correspondence [Bae38]. Theorem 1.2 implies the running time for this case is n^O((log n)^(5/6)).
    For the case of k ≤ (log2(p))^5, we run the aforementioned generating set enumeration algorithm
    [Mil78]. Because every p-group of order p^k has a generating set of size at most k, the running
    time of the algorithm for this case is p^O(k^2), which is also n^O((log n)^(5/6)).
Pasting dropped some superscripts down to regular text; I tried to restore them, but I may have borked a couple.
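EDIT: staring at the second case a bit longer, here's my guess at where the 6 sneaks in (I may be off). Since n = p^k, we get p^O(k^2) = (p^k)^O(k) = n^O(k), and in that case k <= (log2(p))^5, i.e. log2(p) >= k^(1/5), so

    log2(n) = k * log2(p) >= k * k^(1/5) = k^(6/5)
    =>  k <= (log2(n))^(5/6)
    =>  p^O(k^2) = n^O(k) <= n^O((log n)^(5/6))

So the (log2(p))^5 threshold seems to be chosen precisely so that the brute-force case lands on the same (log n)^(5/6) exponent as the Theorem 1.2 case.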


Scroll scroll scroll and...

https://arxiv.org/abs/2303.15412


> Scroll scroll scroll and...

The new result (with an explicit link to the arXiv as well as the author's home page) is linked in the fourth paragraph, and it only appears that far down because the first three paragraphs very efficiently provide background on the problem and recent results. The whole thing is an excellent general-audience article explaining a complex theoretical result, with illustrations and accessible links to all the relevant source material. I'm really glad we have Quanta, and I'm not sure how this reporting could have been handled better.


Exactly.

EDIT: I was going to write a snark-ish comment that someone would eventually complain about the post title only to refresh a second later and see that someone changed the title already.

“Major algorithmic goal” was completely fine, and it says something quite different from “new, faster”.

Also, according to HN’s guidelines, the post title shouldn’t have changed in this situation:

“Otherwise please use the original title, unless it is misleading or linkbait; don't editorialize.”

The original title was not misleading or linkbait.


"Sun's method takes an approach called individualization and refinement..."

Which is what many standard methods for graph isomorphism use, if we are talking about the same thing.
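For anyone unfamiliar: in the graph setting, the "refinement" half usually means color refinement (1-dimensional Weisfeiler-Leman): repeatedly recolor each vertex by the multiset of its neighbors' colors until nothing changes. Here's a rough sketch of just that refinement step (my own toy code for graphs, not anything from Sun's paper, which works on matrix spaces instead):

    def color_refinement(adj):
        """Refine vertex colors by neighbors' colors until stable (1-dim Weisfeiler-Leman)."""
        colors = {v: 0 for v in adj}
        while True:
            # signature = (own color, sorted multiset of neighbor colors)
            sigs = {v: (colors[v], tuple(sorted(colors[u] for u in adj[v]))) for v in adj}
            palette = {s: i for i, s in enumerate(sorted(set(sigs.values())))}
            new_colors = {v: palette[sigs[v]] for v in adj}
            if new_colors == colors:
                return colors
            colors = new_colors

    # Different stable color histograms certify non-isomorphism (the converse can fail).
    path = {0: [1], 1: [0, 2], 2: [1]}
    triangle = {0: [1, 2], 1: [0, 2], 2: [0, 1]}
    print(sorted(color_refinement(path).values()))      # [0, 0, 1]
    print(sorted(color_refinement(triangle).values()))  # [0, 0, 0]

The "individualization" half pins an arbitrary vertex with a fresh color whenever refinement gets stuck, then branches over the possible choices.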


Could this be used for computer program equivalence?


Don't think so; doesn't computer program equivalence require solving the halting problem and other undecidable problems?

For example, consider the empty program and a program that prints "hello, world!" if some undecidable condition is met. I think checking whether these two programs are equivalent is undecidable.
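A toy version of that construction, with entirely made-up names, just to make the shape of the reduction concrete:

    def empty_program():
        pass

    def maybe_hello(mystery_machine, its_input):
        # Deciding whether this ever prints anything (i.e. whether it behaves
        # like empty_program) amounts to deciding whether mystery_machine
        # halts on its_input -- which is the halting problem.
        mystery_machine(its_input)  # may run forever
        print("hello, world!")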


For these two programs it's easy to check that they are not equivalent.

The magic words are "in general": program equivalence, like every other non-trivial semantic property of programs, is undecidable in general (Rice's theorem).



