In Spain, there's a betting game called La Quiniela: https://es.wikipedia.org/wiki/La_Quiniela_(Espa%C3%B1a)
Players predict the outcome of 14 football matches (home win, draw, away win). You win money if you get at least 10 correct, and the prize amount depends on the number of winners. Since all bets are public, the number of winners and the corresponding payouts can be estimated for each of the 3^14 possible outcomes. We can also estimate their probabilities using bookmaker odds, allowing us to compute the expected value for each prediction.
As a side project, I wanted to analyze this, but ran into a computational bottleneck: to evaluate a prediction, I had to sum the values of all its Hamming neighbors up to distance 4. That’s nearly 20,000 neighbors per prediction (1+28+364+2912+16016=19321):
S_naive = sum from k=0 to r of [(d! / ((d-k)! * k!)) * (q-1)^k] (d=14, q=3, r=4)
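For reference, this is roughly what the neighbor count and a brute-force version of the summation look like (the brute force is only feasible for tiny d, and the names here are just illustrative):

import numpy as np
from math import comb

def neighbor_count(d=14, q=3, r=4):
    # number of points within Hamming distance r of a fixed point (itself included)
    return sum(comb(d, k) * (q - 1) ** k for k in range(r + 1))

def naive_radius_sum(values, r):
    # brute-force reference: for every cell of the q^d hypercube, sum all cells
    # within Hamming distance r; O(q^d * q^d), only usable for tiny d
    d = values.ndim
    flat = values.reshape(-1)
    digits = np.indices(values.shape).reshape(d, -1).T  # one base-q tuple per cell
    out = np.empty_like(flat)
    for i, a in enumerate(digits):
        dist = (digits != a).sum(axis=1)                # Hamming distances to cell a
        out[i] = flat[dist <= r].sum()
    return out.reshape(values.shape)

print(neighbor_count())  # 19321, matching 1 + 28 + 364 + 2912 + 16016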
This took days to run in my first implementation. Optimizing it and doing it with matrices brought it down to 20 minutes, still too slow (I'm running it in Google Apps Script, which has a 6-minute execution limit). For a while, I used a heuristic (sketched below): start from a random prediction, check its 28 nearest neighbors, move to the highest-value one, and repeat until no improvement is possible within distance 3. It worked surprisingly well.
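That heuristic is a plain greedy local search. Here is a minimal sketch of it (not the actual code; `value` stands in for the expected-value lookup of a prediction, and the distance-3 widening reflects the stopping rule described above):

import numpy as np
from itertools import combinations, product

def neighbors(pred, q, radius):
    # yield every prediction within Hamming distance `radius` of `pred` (pred excluded)
    d = len(pred)
    for k in range(1, radius + 1):
        for positions in combinations(range(d), k):
            for symbols in product(range(q), repeat=k):
                if any(s == pred[p] for p, s in zip(positions, symbols)):
                    continue  # keep the distance exactly k, avoid duplicates
                cand = list(pred)
                for p, s in zip(positions, symbols):
                    cand[p] = s
                yield tuple(cand)

def hill_climb(value, d=14, q=3, check_radius=3, rng=None):
    rng = rng or np.random.default_rng()
    pred = tuple(int(x) for x in rng.integers(0, q, size=d))
    while True:
        # greedy step over the d*(q-1) = 28 distance-1 neighbors
        best = max(neighbors(pred, q, 1), key=value)
        if value(best) > value(pred):
            pred = best
            continue
        # stuck at distance 1: stop only if nothing within check_radius is better
        better = next((c for c in neighbors(pred, q, check_radius)
                       if value(c) > value(pred)), None)
        if better is None:
            return pred
        pred = better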
But I kept thinking about how to solve the problem properly. Eventually, I realized that partial sums can be accumulated efficiently by exploiting overlaps: if two predictions A and B share neighbors, the contribution of those shared neighbors can be computed once and reused. This is achieved through a basic transformation, implemented with reshape, roll, and flatten (probably not the most efficient implementation, but the clearest), which realigns the matrix by applying an offset along dimension i. The transformation has two key properties (a sketch follows the list) that reduce the number of summations from 19,321 to just 101:
- T(T(space, d1), d2) = T(T(space, d2), d1)
- T(space1, d) + T(space2, d) = T(space1+space2, d)
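A minimal sketch of such a roll-based realignment, together with a numerical check of both properties (simplified relative to the actual implementation; names are illustrative):

import numpy as np

def T(space, dim, q=3, offset=1):
    # realign the flattened q^d hypercube by shifting it `offset` cells along `dim`:
    # reshape to the hypercube, roll along that axis, flatten back
    d = int(round(np.log(space.size) / np.log(q)))
    return np.roll(space.reshape((q,) * d), offset, axis=dim).reshape(-1)

rng = np.random.default_rng(0)
s1, s2 = rng.random(3 ** 5), rng.random(3 ** 5)

# T(T(space, d1), d2) == T(T(space, d2), d1)
assert np.allclose(T(T(s1, 0), 3), T(T(s1, 3), 0))
# T(space1, d) + T(space2, d) == T(space1 + space2, d)
assert np.allclose(T(s1, 1) + T(s2, 1), T(s1 + s2, 1))

Because rolls along different axes commute and rolling is linear, partial sums built once can be shifted and reused across neighboring predictions instead of being recomputed from scratch.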
The number of sums is given by this expression:
S_PSA = 1 + (d - (r-1)/2) * r * (q-1)
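As a quick check with the same parameters (d=14, q=3, r=4):

def psa_sum_count(d=14, q=3, r=4):
    # additions needed by the partial-sum accumulation, per the expression above
    return int(1 + (d - (r - 1) / 2) * r * (q - 1))

print(psa_sum_count())  # 101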
I've generalized the algorithm to any number of dimensions, any number of elements per dimension, and any summation radius. The implementation is in pure NumPy. I've uploaded the code to Colab and GitHub, and written an explanation on my blog.
Apparently, this falls under Hamming neighbor summation, but I haven't found similar approaches elsewhere (maybe I'm searching poorly). If you know of or have worked on something similar, I'd love to hear your thoughts!
colab: https://colab.research.google.com/drive/1aENKd7eemGqmjdB8Y6y...
github: https://github.com/petopello/PSA
blog: https://sudapollismo.substack.com/p/partial-sum-accumulation...
Comments URL: https://news.ycombinator.com/item?id=43210185
Points: 14
# Comments: 0