twtw's comments | Hacker News

Can something be obsolete if it is still extremely widely used?

IIRC, Linux and FreeBSD both use a buddy allocator for physical memory, and I'm pretty sure it is used (in addition to other allocators) in jemalloc as well.


I don't think jemalloc uses any buddy allocators. It's a segregated free list allocator, like most C mallocs and tenured generation allocators.
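
For readers who haven't bumped into the distinction: a buddy allocator hands out power-of-two-sized blocks that split on allocation and coalesce with their "buddy" on free, while a segregated free-list allocator keeps separate free lists per size class and doesn't coalesce that way. Here is a toy sketch of the buddy scheme, purely illustrative (the class and method names are made up, and this is not how Linux, FreeBSD, or jemalloc actually implement their allocators):

```python
# Toy buddy allocator: power-of-two blocks, split on alloc, coalesce on free.
# The "buddy" of a block at `offset` with size `size` lives at `offset ^ size`.
class ToyBuddyAllocator:
    def __init__(self, total_size, min_size=16):
        assert total_size & (total_size - 1) == 0, "total size must be a power of two"
        self.total_size = total_size
        self.min_size = min_size
        self.free_lists = {total_size: [0]}   # size -> list of free block offsets

    def _round_up(self, n):
        size = self.min_size
        while size < n:
            size *= 2
        return size

    def alloc(self, n):
        want = self._round_up(n)
        size = want
        # find the smallest free block that is big enough
        while size <= self.total_size and not self.free_lists.get(size):
            size *= 2
        if size > self.total_size:
            return None                       # out of memory
        offset = self.free_lists[size].pop()
        # split the block in half repeatedly until it is the requested size
        while size > want:
            size //= 2
            self.free_lists.setdefault(size, []).append(offset + size)
        return offset, want

    def free(self, offset, size):
        # coalesce with the buddy as long as the buddy is also free
        while size < self.total_size:
            buddy = offset ^ size
            if buddy not in self.free_lists.get(size, []):
                break
            self.free_lists[size].remove(buddy)
            offset = min(offset, buddy)
            size *= 2
        self.free_lists.setdefault(size, []).append(offset)
```

A jemalloc-style segregated free list instead rounds each request up to a size class and pops a block off that class's list, with no splitting or buddy merging involved.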


But they neglected maintenance when serious issues with the plane were apparent. The pilots of the flights prior to 610 noted that the two AoA sensor readings differed by ~20 degrees, and nearly aborted their flight before overriding the auto trim.

Flight 610 should never have taken off.


That's a misstatement of what occurred. The aircraft had a fault on the previous two flights and was repaired before the fatal flight.

> Flight 610 should never have taken off.

A repaired aircraft should not take off?


The preliminary report from the investigation of LNI610 disagrees with you: https://reports.aviation-safety.net/2018/20181029-0_B38M_PK-...

The AoA sensor was replaced prior to flight LNI043 (the flight prior to 610), in which the pilots declared pan-pan and had to manually override the auto trim systems (which include MCAS). Between 043 and 610, there was further maintenance that included flushing the pitot and static systems and cleaning electrical connectors, but the AoA sensor was not replaced between 043 and 610.

An aircraft that was repaired and then, on the very next flight, had its pilots declare abnormal operation and run 3 non-normal checklists should probably not fly again until someone figures out what happened, and the crew of the next flight should definitely be aware of the incident on the flight prior. 043 faced and overcame the exact same malfunction as 610 - to me that indicates a clear failure. If you keep flying an aircraft that malfunctions on each flight, you're pressing your luck.


> Between 043 and 610, there was further maintenance that included flushing the pitot and static systems and cleaning electrical connectors, but the AoA sensor was not replaced between 043 and 610.

They followed Boeing's procedure in attempting to address the issue again (after already replacing the sensor). That's a completely appropriate action.

> the next flight should definitely be aware of the incident on the flight prior.

It is in the log book. Is this a reference to something particular?

> 043 faced and overcame the exact same malfunction as 610 - to me that indicates a clear failure.

I agree, that's why maintenance conducted a repair.

> If you keep flying an aircraft that malfunctions on each flight, you're pressing your luck.

They attempted two repairs in accordance with the manufacturer's procedures within that time in order to resolve the malfunctions.

This post references the report, and you clearly read it enough to know your original conclusion was mistaken, but you are still arguing you're right by trying to shift the discussion. You've gone from "they flew a broken aircraft" to "alright, so they repaired it twice, but you cannot ever trust a malfunctioning aircraft again." By that logic every single commercial aircraft would be in the junkyard... Repairing malfunctioning systems is normal, and attempting two different repairs isn't uncommon either.


It's not fixed until it is fixed.

I understand that with physical systems the cost and complexity of detecting faults is higher, but if you're putting other people's lives on the line, you damn well better do a live test of your vehicle before another living soul not certified as a pilot or flight engineer is allowed on board.

The more I read about this, the more it appears to me that excessive trust is placed on filed paperwork. Nothing says a fix is done like a successful test flight that specifically attempts to recreate the conditions surrounding the original failure.


I don't disagree, but now we're holding Lion Air to a much higher standard than any other commercial airline, including US and EU based ones.

Certain repairs are tested. Non-safety critical sensors like AOA are not, because you're meant to be able to land if there's a malfunction. Obviously not in this case, which points to procedural problems beyond any one airline.


The AoA sensor stopped being non-safety-critical when its output was allowed to drive a system capable of endangering the aircraft.

I don't believe Lion Air is directly responsible; they didn't test-fly it, but they didn't know they should have, either, due to Boeing's poor communication of the functionality and justification for MCAS.

It's a grievous failure all around, and yet another reason I stand by the belief that if there is doubt, there is no doubt.


> You've gone from "they flew a broken aircraft" to "alright, so they repaired it twice, but you cannot ever trust a malfunctioning aircraft again."

If you think I've changed my argument, you misunderstood my comments. I still think that they flew a broken aircraft.

I'll stand by my statement that 610 should never have taken off, and I'm somewhat surprised that it is contentious. I don't care if all the protocols were followed and the logs made (though obviously the pilots of 610 didn't understand what the previous pilots had done to respond to their incident) - in retrospect, we know that the aircraft was not airworthy going into flight 610, so something needs to change so that next time that is detected before takeoff. Whether that is better observation of the protocols or better protocols, I'm not sure.


Does repaired in this context also include a successful in-flight test?

At some point, the complexity of system integrations requires that you do something more than bare minimum component retesting.


This is a seriously misleading characterization of the 737 Max. If you read https://theaircurrent.com/aviation-safety/what-is-the-boeing... (the source of the article that 'acqq is selectively quoting) you will learn the following:

> MCAS is “activated without pilot input” and “commands nose down stabilizer to enhance pitch characteristics during steep turns with elevated load factors and during flaps up flight at airspeeds approaching stall.”

> Since it operates in situations where the aircraft is under relatively high g load and near stall, a pilot should never see the operation of MCAS.

In other words, the aircraft does not need MCAS for stability during flight in anything but exceptional circumstances. Normal flights should never have MCAS active.

This is not remotely similar to the aerodynamics of a fighter jet.

MCAS is a software mechanism designed to prevent stalls in extreme circumstances that can be overridden with the same method used to disable auto trim on previous models of the 737. The A320 has software to prevent phugoid motion in exceptional scenarios, except the A320's mechanism cannot be overridden by the pilot.

I don't care if you choose to never fly on a plane again, but don't spread this misinformed hysteria.


[flagged]


Crossing into personal attack will get you banned here. Please review https://news.ycombinator.com/newsguidelines.html and don't do that again.

Accusing someone else of astroturfing or shillage without evidence is particularly out of line.

https://hn.algolia.com/?sort=byDate&dateRange=all&type=comme...


The comment you are responding to addresses the claim that the 737 max is not passively stable. It is less safe than other aircraft, but not because it requires active control for stable flight in any manner comparable to a fighter jet.

I'll direct you to some portions of my other comments that you seem to have overlooked:

> airlines and pilots were not informed well enough about MCAS

> previous auto trim systems could also be "out muscled" just by pulling back on the yoke without actually switching on the manual override.

----

> polluting this thread with lies

I'm pretty sure everything I've posted on this thread is either true or obviously an opinion. If some things are not true, please point them out specifically so that I can correct them.

Nowhere have I said that the Lion Air 610 accident only happened because of poor maintenance or pilot actions, but it is undeniably true that with different pilot actions and aircraft maintenance the accident would not have occurred.


No, it doesn't. The 737 max has software to prevent stalls in situations outside of the profile of normal flight. As does e.g. the A320.


Yep. Apparently the main problem is that there is a single point of failure (a single AoA sensor), and it could be resolved by having multiple sensors; it's not that some fundamentally new kind of automation is occurring compared to before.


The 737 max has multiple AoA sensors.

Having multiple sensors doesn't always help, as failures are correlated. For example, there was an Airbus A321 that had 2 of its 3 sensors get stuck in the same position due to icing.[1] The computer thought that the one working sensor was malfunctioning and disregarded it. It then engaged stall protection, dipping the nose and causing the plane to dive at 4,000ft per minute.

> The captain continued to hold “more than 50%” rearward stick in stable flight for a period, but with help from technicians on the ground, the crew was able to reconfigure the automation into the aircraft’s alternate control law, rather than its normal “direct” law. The action removed the alpha-protection checks and canceled the nose-down input. The aircraft then continued to its destination.

These failure modes tend to happen more with Airbus planes because their flight control systems default to overriding human input if the computer deems it unsafe. Boeing's flight computers will give more resistance in the controls, but they mostly won't prevent the pilot from doing what they want to do. The exceptions (such as auto-trim and stall prevention) can be disabled by flipping a couple of switches.

1. https://aviationweek.com/commercial-aviation/german-investig...
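
To make the correlated-failure point concrete, here's a tiny illustrative sketch of a median voter over three redundant sensors. The numbers and function names are made up, and real flight-control voting logic is far more involved:

```python
# A simple median voter trusts whichever readings agree, so two sensors
# frozen at the same bad value outvote the one healthy sensor.
from statistics import median

def voted_aoa(readings_deg):
    """Vote among three AoA readings by taking their median."""
    return median(readings_deg)

true_aoa = 2.0                                 # actual angle of attack (degrees)
frozen = 25.0                                  # value two iced-up sensors are stuck at

print(voted_aoa([true_aoa, frozen, frozen]))   # 25.0 -> correlated failure wins the vote
print(voted_aoa([true_aoa, true_aoa, frozen])) # 2.0  -> an uncorrelated failure is outvoted
```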


True but 2/3 failing instead of 1/1 is still the better option. I’ve read the Max only used one sensor for the MCAS system.

But I’m not an expert on the subject, so I’ll defer to the reports and more knowledgeable people.


I recall reading somewhere that while the 737Max has two AoA sensors, only one of them is connected to the MCAS.


"The automatic trim we described last week has a name, MCAS, or Maneuvering Characteristics Automation System. It’s unique to the MAX because the 737 MAX no longer has the docile pitch characteristics of the 737NG at high Angles Of Attack (AOA).

This is caused by the larger engine nacelles covering the higher bypass LEAP-1B engines."

Boeing only told pilots MCAS existed following the Lion Air crash, almost 18 months after the plane entered service.

https://leehamnews.com/2018/11/14/boeings-automatic-trim-for...


It's interesting that you reach that conclusion.

I'll fly 737 max.

I won't fly Lion Air or Ethiopian Airlines.

Southwest: fleet size 754, founded 1967, total of seven accidents with 3 deaths.

Ethiopian Airlines: fleet size 107, 64 accidents with 459 deaths since 1965.


I don't have data on flight miles or flight segments by type of aircraft, so let's do a quick and dirty estimate.

Let's say there are 40,000 commercial aircraft worldwide and 350 of these are the Boeing 737 Max [0].

If all aircraft are equally likely to crash, then the probability of a given crash being a 737 Max is 350/40000 = 0.00875.

The probability that two crashes are both 737 Max is 0.00875 * 0.00875 = 0.0000766

It's extremely unlikely that an aircraft representing less than 1% of the global fleet would crash twice in a short period of time unless there is a serious defect with that aircraft.

That's well past probable cause at this point. This aircraft should be grounded until they figure this out.

[0] https://en.wikipedia.org/wiki/List_of_Boeing_737_MAX_orders_...
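
For what it's worth, here are the back-of-the-envelope numbers above as a runnable sketch (same assumed fleet sizes; this is the naive version that the replies below refine):

```python
# Naive estimate: if every aircraft were equally likely to crash, what is
# the chance that one crash, and then two independent crashes, involve a MAX?
max_fleet = 350        # assumed 737 MAX aircraft in service
world_fleet = 40_000   # assumed commercial aircraft worldwide

p_one = max_fleet / world_fleet   # ~0.00875
p_two = p_one ** 2                # ~0.0000766
print(p_one, p_two)
```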


> The probability that two crashes are both 737 Max is 0.00875 * 0.00875 = 0.0000766

You need to multiply this by the number of ways of picking two crashes in the observed time period, though, with the start and end points not cherry-picked to include the 737 MAX crashes.


Be my guest. You'll find that the probability of a rare aircraft crashing twice in a short period of time is infinitesimal, unless that aircraft contributed to the catastrophe.


It's hardly infinitesimal, that's my point.

With an average of 175 737 MAXes operating since launch, 4K total widebody commercial aircraft and over a dozen widebody crashes in that period, you get over 10%. Those are rough estimates, but you still have the endpoint issue as well.


There have been a total of 5 commercial air disasters in 2018 and 2019 with fatalities that didn't involve hijacking, landing short of the runway or overshooting the runway [0].

And that includes the crash of a cargo flight with no passengers where the crew was killed.

The 737 Max 8 was involved in 2 of those 5 disasters.

There are between 25,000 and 39,000 commercial aircraft in service depending on who you ask [1].

With 350 737 Maxes delivered so far, that's at most 1.4% of the total today, probably less than half that this time last year. Let's call it 1% on average for 2018 and 2019 combined.

There are 10 ways you can have 2 Max crashes out of 5 total crashes. So the probability is 10 * 0.01^2 * 0.99^3 = 0.00097

Even accounting for n choose k and longer endpoints, it's still an infinitesimal probability that we'd see two catastrophes with the same rare aircraft -- unless that aircraft contributed to the catastrophe.

It should be grounded.

[0] https://en.wikipedia.org/wiki/List_of_accidents_and_incident...

[1] https://www.telegraph.co.uk/travel/travel-truths/how-many-pl...
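
Here is a sketch of that binomial estimate, with rough numbers only (it treats each of the five crashes as an independent draw with a ~1% chance of involving a MAX, which is itself a simplification):

```python
# P(exactly 2 of 5 fatal crashes involve a 737 MAX), assuming crashes are
# independent of aircraft type and the MAX is ~1% of the in-service fleet.
from math import comb

p = 0.01      # assumed MAX share of the fleet
n, k = 5, 2   # 5 qualifying fatal crashes in 2018-2019, 2 involving the MAX

prob = comb(n, k) * p**k * (1 - p)**(n - k)
print(f"{prob:.5f}")   # ~0.00097
```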



It is possible for both a carrier to be more at-risk for crashes as well as a plane to be more at-risk for crashes.

The FAA issued an emergency operation directive for the 737 Max due to inaccurate sensors which could lead to a crash. I'm not entirely sure why you're ignoring that fact and deflecting to something unrelated.


> deflecting to something unrelated

I don't think it is unrelated that the only accident thus far of a plane with a failure mode that is more difficult for pilots to respond to was on an airline with a notoriously bad safety record.

I fully recognize the fact that airlines and pilots were not informed well enough about MCAS, but I'm also not ignoring the other circumstances of Lion Air 610.

Safety in depth - a single failure should not be a problem. 610 crashed as a result of MCAS, but also a number of other operational failures of Lion Air and the pilots.


Downvoted. Your data is terrible and should be cited. The 737Max has only been flying for 2-3 years. Southwest is irrelevant (I say as I sit on a SWA flight on the tarmac in SAN waiting for a replacement crew member). I made damn sure it wasn’t a Max (which I flew on Aeromexico twice back in September).

I have no idea where your Ethiopian stats come from because I’ve heard nothing but great things about them. Feel free to rebut.

“Or how about Ethiopian Airlines? Here is another impoverished country surrounded by rugged terrain. Yet the record of its national carrier — three fatal events, one of them a hijacking, in over seventy years of operation — is exceptional. Ethiopian is one of the proudest and arguably one of the safest airlines in the world.”

http://www.askthepilot.com/questionanswers/foreign-airline-s...


Unable to delete this now, but I was able to find stats last night that confirm the parent's Ethiopian Air stats, and I regret the tone.

Regardless, Ethiopian Air is considered a top tier safe carrier.


I think accidents per trip (flight leg) is a more useful metric than accidents per year. If you're interested in mechanical issues then you'd also need to remove terrorism related deaths.

Yes, it's a small sample size, but if you look at the fatal accident rate per flight leg of a 737 MAX 8, I'm sure it's a much higher rate than that of any other modern aircraft.


According to https://www.tripsavvy.com/the-safest-aircraft-54428, the following modern aircraft have fatality-free records:

Boeing 717 (formerly the MD95)

Bombardier CRJ700/900/1000 regional jet family

Airbus A380

Boeing 787

Boeing 747-8

Airbus A350

Airbus A340


This list is meaningless without accounting for the number of aircraft & the number of miles they have flown. For example, the 737 is not only the world's most popular commercial airliner, it's simultaneously the "most dangerous plane" because it has had 145 accidents [1] and one of the safest planes because the 737 NG variant has only had one crash in 16,047,900 flight hours. [2]

[1] https://www.airfleets.net/crash/stat_plane.htm

[2] http://www.travelvivi.com/the-safest-aircrafts-in-the-world/

And indeed, the 787, 747-8, A350 and arguably the A380 are all new planes that have not accumulated the decades of flight history with huge fleets that the 737 has.


True, but the 737 MAX 8's first commercial flight was in 2017, much newer than any of the aircraft on the above list. The point is that such a new aircraft already has a far worse safety record than plenty of aircraft that have been around for years. I think it's irrelevant to compare 737 subtypes, since discussion focuses on the 737 MAX 8 variant specifically.

> And indeed, the 787, 747-8, A350 and arguably the A380 are all new planes that have not accumulated the decades of flight history with huge fleets that the 737 has.

The 787's first commercial flight was in 2011. The 747-8 in 2012. A350: 2015. A380: 2007. All predate the 737 MAX 8 by years, and have better safety records.


You have to know that the problematic anti-stall feature was just recently added in the MAX rev 8. The older rev 1-7 don't have it. It might have other stall problems due to being back-heavy, but the traditional stall warning should have been good enough. There was no accident in rev 1-7, none with the improved Southwest configuration (with two sensors), and already two complete losses with the updated MCAS rev 8. They already did two more updates on this (9 and 10), but it's still not safe enough for modern safety standards.


Sure, but for comparison, the original 737 was launched in 1967 (!) and even the 737 NG has been flying since 1997.

So while having two MAXes crash after takeoff mere months apart is indeed statistically unlikely, I wouldn't necessarily leap to the conclusion that the two accidents are related just yet.


The Concorde used to be #1 on this list, while being the least safe commercial jetliner of all time.


Southwest also took steps to add additional AoA sensor information displays to their 737 Max aircraft in the wake of the Lion Air crash. The operator does make a difference.

https://theaircurrent.com/aviation-safety/southwest-airlines...


If Boeing is issuing software updates that disable manual override without telling pilots, Boeing is completely at fault here, and some of the engineers/executives should face criminal charges.


I am confused. When I read the article about the Lion Air crash, I thought one cause of the crash was that the pilots hadn't been properly trained in the manual override procedure that can deal with the problem with malfunctioning sensors.

I don't understand how a software update can override a completely manual override.


The procedure that pilots are taught to override auto trim was unchanged, but I think that previous auto trim systems could also be "out muscled" just by pulling back on the yoke without actually switching on the manual override. It's that second part that people are pointing out (though I will in turn point out that the pilots on the flight prior to 610 faced similar issues and used the manual override to respond to them).


> criminal charges

Boeing informed the FAA of the changes, and the FAA decided that the pilots and airlines did not need to be informed. Criminal charges for Boeing would be extremely surprising.


Would the FAA decision have any bearing on what happens in Ethiopian airspace?


Taking your question at face-value: FAA guidelines apply to all airlines that fly into the US - Ethiopian does.


Good point. Boeing should have told every customer that bought a plane.


But Southwest did something nobody else did: they added a second AoA sensor. This should be mandatory under every safety standard, and Southwest did well by mandating it, but Boeing still got away with it for all the other companies.


Good luck flying Southwest in Africa.


You write as if you always have a choice of which airline to use. Southwest is mainly USA domestic, with a few vacation destinations near the USA.


The new system (MCAS) was not intended to take action in normal flight. It was intended to trim in certain scenarios (roughly high AoA + steep turn + high g + near stall), but on Lion Air 610 the AoA sensor was malfunctioning, which caused MCAS to activate when it shouldn't have.


There already is a switch to disable auto trim. The pilots on the flight prior to 610 used it, as trained. The pilots on 610 did not.


> because Comcast has -bribed- donated to way more congresspeople than google

Is that true? [1] indicates Google spent more than Comcast in lobbying in q1 2015. I have no idea how this varies quarter by quarter and since 2015, but Google spends a lot lobbying.

I would not be surprised if that extends to campaign contributions as well. Google donates a ton of money to a ton of politicians.

[1] https://www.theguardian.com/technology/2015/apr/21/google-co...


I don’t know what Google spent their money lobbying for, so I’m not going to pick a side in my comment. I’d say the more important thing isn’t who spent the most money but what they spent it on. If, for example, Google spent some of that money fighting for net neutrality, then that is better in my mind than Comcast spending most of their money fighting against it. I’m not saying this is the case or that Google has lobbied for good things. But knowing generally what they lobbied for would be a more useful piece of information.


Discrete optimization and automatic differentiation.


Forward AD is the pushforward of a tangent vector (an element of the tangent space); reverse AD is the pullback of a cotangent vector (an element of the cotangent space). The duality between tangent and cotangent spaces is the same notion of duality as the one between primal and dual spaces in optimization. Unfortunately, I'm only passingly familiar with discrete optimization, but I would suspect the notion carries over from continuous optimization. That's not to say that they are fundamentally the same or that writing this down helps anybody in any way, but a lot of these "dual" notions do have some sort of dual vector space under the hood.
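
Here's a tiny pure-Python sketch of that pairing for one concrete scalar function; no AD library is involved, and the function names (f_jvp, f_vjp) are just labels used here, not anyone's API. Forward mode pushes a tangent vector v through f and returns the directional derivative Jf(x)·v; reverse mode pulls an output cotangent u back and returns u·Jf(x). For a scalar output the two agree through the pairing <grad f, v>:

```python
import math

def f(x, y):
    return math.sin(x * y) + x

# Forward mode: propagate (value, tangent) pairs through the computation.
def f_jvp(x, y, vx, vy):
    p, dp = x * y, vx * y + x * vy          # product rule
    s, ds = math.sin(p), math.cos(p) * dp   # chain rule
    return s + x, ds + vx                   # value, directional derivative

# Reverse mode: run the computation, then propagate an output cotangent backwards.
def f_vjp(x, y, u=1.0):
    p = x * y
    d_x = u                      # adjoint of the trailing "+ x"
    d_p = math.cos(p) * u        # back through sin
    d_x += y * d_p               # back through x * y
    d_y = x * d_p
    return d_x, d_y              # u times the gradient of f

x, y, vx, vy = 0.7, 1.3, 0.2, -0.5
_, jvp = f_jvp(x, y, vx, vy)
gx, gy = f_vjp(x, y)
print(jvp, gx * vx + gy * vy)    # the two agree (up to floating point): <grad f, v> = Jf @ v
```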


Yeah, but all you're really describing here is linear algebra. Vector spaces and linearity are a significant part of every single discipline the grandparent commenter mentioned, but they picked out duality.

I would agree with the critique: I don't think highlighting duality here is particularly useful. For example, the way dual numbers are used to extend the reals for automatic differentiation doesn't have a deep connection to duality in vector spaces. It's just a very general semantic concept that describes pairs of things. But it doesn't say that any given pair of dual things is related to another pair of dual things.


> For example, the way dual numbers are used to extend the reals for automatic differentiation doesn't have a deep connection to duality in vector spaces.

They don't, because certain operations, such as optimization, are hard to reason about in linear spaces.

Don't get me wrong, I'm not shitting on vector spaces. All I'm saying is that some problems that are hard to do in vector spaces are easy in the smooth spaces, and vice versa. Having these two APIs to the same space is much more powerful, because, again, you generalize over the conversions between the two spaces. You use whichever API is more appropriate in the particular context.

In some sense the linear spaces deal with things like infinity, the smooth spaces deal with cyclical things (signals, wavelets, modular arithmetic).


Quite a bit of optimization is easy to reason about in linear algebra. Take linear and mixed integer programming, for example. And convex optimization subsumes linear optimization in general. There is a lot of nonlinear optimization, but I can assure you with extremely high confidence that the common thread you're seeing here isn't duality, but more abstractly linearity.

Likewise cyclic things show up all the time in purely algebraic (read: discrete, non-smooth) contexts. We have that in vector spaces, group theory, rings, modules, etc.


They show up separately but not in tandem.

The canonical example is robotic motion and the reason why Lie theory is used there. You have very discrete states (positions) that you want to interpolate between smoothly.


> For example, the way dual numbers are used to extend the reals for automatic differentiation doesn't have a deep connection to duality in vector spaces.

Yes, the right way to think about dual numbers (especially once you generalize them beyond just the single e^2 = 0) is to think of them as tangent vectors (sections of the tangent bundle). I've never really liked the "dual number" terminology here. That's why I deliberately chose to use the duality of forward- and reverse-mode AD, because that notion of duality agrees with the underlying linear algebra (or, in general, differential geometry). I do agree it's a mess of terminology.
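
A minimal dual-number sketch in that spirit (the class name and the tiny operator set are made up here, purely for illustration): arithmetic on a + b*e with e^2 = 0 carries the derivative along for the ride, which is exactly forward-mode AD along one tangent direction.

```python
from dataclasses import dataclass

@dataclass
class Dual:
    re: float    # the value
    eps: float   # the coefficient of e (the tangent / derivative part)

    def __add__(self, other):
        return Dual(self.re + other.re, self.eps + other.eps)

    def __mul__(self, other):
        # (a + b*e)(c + d*e) = ac + (ad + bc)*e, because e^2 = 0
        return Dual(self.re * other.re,
                    self.re * other.eps + self.eps * other.re)

def g(x):
    return x * x * x + x   # g(x) = x^3 + x, so g'(x) = 3x^2 + 1

print(g(Dual(2.0, 1.0)))   # Dual(re=10.0, eps=13.0): g(2) = 10 and g'(2) = 13
```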


Gimme five and I'll answer two. There's quite a few pairwise permutations and some are easier to understand and more instructive than others.

Fundamentally, they are both connected via the idea of convex optimization. Automatic differentiation is a computational technique to solve optimization problems.

Yes, optimization problems are very general; however, calculus is a fundamental tool for them. Dual numbers are somewhat like Lie groups: very smooth and conducive to optimization.


Curious, can you expand on the connection to convex optimization? To my understanding, discrete optimization is nonconvex by nature due to discontinuities in the feasible space.


There are two types of spaces, discrete and continuous. These are in a dual relationship.

Duality is the isomorphism between these two. For example, for humans, it's easier to reason about discrete spaces. However a lot of things simply cannot be done that way.

Think of anything that is tangential (pun intended) to Lie theory. In the context of Lie theory, you have the discrete group and the continuous algebra (the group's tangent space). You go between these two using the exponential map (algebra -> group) and the logarithm to go back (group -> algebra).

It's the difference between an integral and a Riemann sum. It's the fundamental idea that underlies sampling (say audio sampling or even statistical sampling). You capture some invariants and then you interpolate between these invariants to recreate some smooth curve (or distribution).

The nice thing about the smooth space is that optimization is easy. In the exponential space, addition is multiplication and some expensive things are cheap (computationally speaking).
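
The simplest concrete version of that exp/log round trip is planar rotation, SO(2), with unit complex numbers standing in for the group and plain angles for its tangent space. The sketch below (names and numbers are made up, purely illustrative) interpolates between two poses by going to the algebra with log, scaling, and coming back with exp:

```python
import cmath, math

def exp_so2(theta):       # algebra -> group: angle to unit complex number
    return cmath.exp(1j * theta)

def log_so2(r):           # group -> algebra: unit complex number to angle
    return cmath.phase(r)

r0 = exp_so2(math.radians(10))    # two "discrete" poses
r1 = exp_so2(math.radians(170))

for t in (0.0, 0.25, 0.5, 0.75, 1.0):
    # interpolate in the tangent space, then map back to the group
    r_t = r0 * exp_so2(t * log_so2(r1 / r0))
    print(round(math.degrees(log_so2(r_t)), 1))   # 10.0, 50.0, 90.0, 130.0, 170.0
```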


Unless I'm severely misunderstanding you, a discrete set (or function) cannot be a dual of a continuous set (or function). If nothing else, the former is countable and the latter is uncountable; there can be no isomorphism between the two.


> Unless I'm severely misunderstanding you, a discrete set (or function) cannot be a dual of a continuous set (or function). If nothing else, the former is countable and the latter is uncountable; there can be no isomorphism between the two.

There is. Discrete samples are samples of the continuous space. In order to capture the continuous space, you only really need to capture a particular set of samples that you then interpolate between.

The way I interpret isomorphism in this context is whether you can capture one space in the other and then convert back without a loss of information.

Imagine a polynomial (in the smooth space). You can capture a particular set of points that uniquely determines the polynomial. In some circumstances you can use these samples to reconstruct the original polynomial by interpolating between the points.
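
Concretely, here is the version of that claim which does hold: n + 1 samples pin down a polynomial of degree at most n, and Lagrange interpolation reconstructs it exactly, provided you know the degree bound (which is exactly what the replies below push back on). A small sketch, with a made-up cubic:

```python
def lagrange_eval(xs, ys, x):
    """Evaluate at x the unique degree <= len(xs)-1 polynomial through (xs, ys)."""
    total = 0.0
    for i, (xi, yi) in enumerate(zip(xs, ys)):
        term = yi
        for j, xj in enumerate(xs):
            if j != i:
                term *= (x - xj) / (xi - xj)
        total += term
    return total

p = lambda x: 2 * x**3 - x + 5          # a cubic, so degree 3
xs = [-1.0, 0.0, 1.0, 2.0]              # degree + 1 = 4 samples
ys = [p(v) for v in xs]

print(lagrange_eval(xs, ys, 0.5), p(0.5))   # 4.75 4.75 -> exact reconstruction
```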


Okay, I think I understand what you're getting at. But if you've taken a set of points from a continuous set (like an interval on the reals) and you can put those in bijection with a discrete set, then by definition your subset of the continuous set isn't continuous. It must be discrete.

More succinctly, you actually can't draw an isomorphism between discrete and continuous spaces without losing information from the continuous space.


Here's the thing: your description of the continuous set is already discretized. If we say "the interval from 4 to 6", we have captured a continuous space using only two numbers.

I know that this is a silly argument in some sense, but this is something that you do so naturally that you don't even think about it.

Do you see what I'm getting at? You capture extrema in the discrete space and then interpolate in the smooth space to recreate the smooth curve.


> Imagine a polynomial (in the smooth space). You can capture a particular set of points that uniquely determines the polynomial. In some circumstances you can use these samples to reconstruct the original polynomial by interpolating between any of the two points.

This is not duality though and you do lose information.

For instance let’s say one has a cubic polynomial and one samples 5 points from it and stores those points. If one didn’t know the original order was cubic, and if one tried to interpolate over the 5 points to fit a quintic, that would be an incorrect reconstruction.


You don't lose information if you pick your points correctly (you store only the extrema). In the cubic case, you need the two extrema (one minimum and one maximum and you need to know whether each extremum is a min or max) and then interpolate between them.


Unfortunately this is incorrect. Extrema do not always exist (consider y=x^3) and they do not uniquely define a polynomial (y=x^2 and y=x^4 both have minima at x=0).


Unfortunately this is incorrect. Those minima are not the same. Remember that dual points have a real part and a dual part that indicates the rate of change at that point. The real part is the same but the dual is different.


My point is it’s not possible to uniquely reconstruct an arbitrary polynomial by just knowing the extrema because there may be information loss in the general case. I will stop here.


It is possible if you know the rate of change, which you do with dual points. You don't interpolate just the position but also the dual parts, i.e. the rate of change.


I’m not sure. I’m not entirely convinced that discrete and continuous spaces are dual spaces. They are connected, but they are not duals.

Same with sampling vs continuous. One cannot interchange the order of the composing morphisms while preserving the properties of the original. The sampled object cannot reconstruct the continuous object in all situations due to effects like aliasing.

In optimization, the concept of duality is also a much stronger idea: the primal and the dual of a problem are opposing views of the same problem that correspond exactly (not approximately) in their dual properties.

Discrete optimization is nonconvex by nature (does not satisfy convexity definitions) so I’m not sure if it has any duality relations to convex optimization. There is a relationship but it is not a dual relationship.


Look into Chu spaces.

> One cannot interchange the order of the composing morphisms while preserving the properties of the original.

Good observation; one really can't, but that was never a hard requirement, right? Ordering actually becomes more interesting, because you can have interesting properties like anti-commutativity (https://en.wikipedia.org/wiki/Anticommutativity), which is a lot more useful than commutativity. Lie brackets are anti-commutative, btw.

> In optimization, the concept of duality is also a much stronger idea: the primal and the dual of a problem are opposing views of the same problem that correspond exactly (not approximately) in their dual properties.

My view is more general. The difference between these spaces lies in the idea of choice and in the idea of adversarial choice. You are correct: they are opposing views, like two players playing a game.

I control my moves. I do not have control over my opponent's moves; however, I do have knowledge about my opponent's potential moves. Therefore I can do some sort of min-max optimization to figure out my optimal play, given my situation and knowing my opponent's options.


But it is. Duality requires commutativity of composition.

It sounds like the ideas that you've put forward confound duality with something else, perhaps transformations. You may be digging a hole here.


> Duality requires commutativity of composition.

Not necessarily.

No, I’m not confounding it.


Apple started OpenCL too, and that didn't last very long.

