Challenges in join optimization

(starrocks.io)

34 points | by HermitX 6 hours ago

2 comments

  • j-pb 1 hour ago
    Whenever I read join optimisation articles in SQL-based systems, it feels... off.

    There is too much heuristic fiddling involved, and way too many niche algorithms that get cobbled together into an optimiser.

    As if we're missing the theory to actually solve the stuff, so we're instead hobbling along by covering as many corner cases as we can, completely missing some elegant and profound beauty.

    • Sesse__ 49 minutes ago
      This post certainly has too much heuristic fiddling! Instead of a coherent framework, it takes a bunch of second-rate heuristics and tries to use… well, all of them. “Generate at most ten plans of this and one of that”? It also has pages and pages talking about the easier parts, for some reason (like maintaining maps, or that a Cartesian product and an inner join are basically the same thing), and things that are just wrong (like “prefer antijoins”, which is bad in most databases since they are less-reorderable than almost any other join; not that you usually have much of a choice in choosing the join type in the first place).

      There _are_ tons of corner cases that you need to address since there are some super-hard problems in there (in particular, robust cardinality estimation of join outputs is a problem so hard that most of academia barely wants to touch it, despite its huge importance), but it doesn't need to be this bad.

      • willvarfar 18 minutes ago
        Can join cardinality be tackled with cogroup, by not expanding the rows until the final write?
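        A minimal sketch of the idea (my own illustration; the `cogroup` and `join_cardinality` helpers are hypothetical, not from any system mentioned here): cogrouping both inputs by key gives the exact inner-join output size as the sum of |L_k| * |R_k| per key, without ever materializing a matched row.

```python
from collections import defaultdict

def cogroup(left, right, key):
    """Group rows from both inputs by join key, without joining them."""
    groups = defaultdict(lambda: ([], []))
    for row in left:
        groups[row[key]][0].append(row)
    for row in right:
        groups[row[key]][1].append(row)
    return groups

def join_cardinality(left, right, key):
    """Exact inner-join output size: sum over keys of |L_k| * |R_k|,
    computed without expanding any matched rows."""
    return sum(len(l) * len(r) for l, r in cogroup(left, right, key).values())

left = [{"k": 1}, {"k": 1}, {"k": 2}]
right = [{"k": 1}, {"k": 1}, {"k": 1}, {"k": 2}]
print(join_cardinality(left, right, "k"))  # 2*3 + 1*1 = 7
```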
    • jasonwatkinspdx 29 minutes ago
      Optimal join order is NP-Hard.
    • akoboldfrying 15 minutes ago
      Well, it's to be expected that heuristics are needed, since the join ordering subproblem is already NP-hard -- in fact, a special case of it, restricted to left-deep trees and with selectivity a function of only the two immediate child nodes in the join, is already NP-hard, since it amounts to the problem of finding a lowest-cost path in an edge-weighted graph that visits each vertex exactly once, which is basically the famous Traveling Salesperson Problem. (Vertices become tables, edge weights become selectivity scores; the only difficulty in the reduction is dealing with the fact that the TSP wants to include the cost of the edge "back to the beginning", while our problem doesn't -- but this can be dealt with by creating another copy of the vertices and a special start vertex; ask me for the details if you're interested.)
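      The left-deep special case above can be made concrete with a brute force over orders (the function and the weight matrix are made up for illustration): each order is a Hamiltonian path over the tables, and its cost is the sum of the edge weights along the path.

```python
from itertools import permutations

def cheapest_left_deep_path(n, weight):
    """Brute-force the special case where cost depends only on consecutive
    pairs: each left-deep join order is a path visiting every table once,
    and its cost is the sum of pairwise 'selectivity' weights along it."""
    best_cost, best_order = float("inf"), None
    for order in permutations(range(n)):
        c = sum(weight[order[i]][order[i + 1]] for i in range(n - 1))
        if c < best_cost:
            best_cost, best_order = c, order
    return best_cost, best_order

# 4 "tables" with symmetric pairwise weights (hypothetical selectivity scores);
# the chain 0-1-2-3 is the only cheap path through the graph.
w = [[0, 1, 9, 9],
     [1, 0, 1, 9],
     [9, 1, 0, 1],
     [9, 9, 1, 0]]
print(cheapest_left_deep_path(4, w))  # → (3, (0, 1, 2, 3))
```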
  • Ibrahim26 56 minutes ago
    Yeah bro, I'm a student, so let me learn something new.