A central goal in average-case complexity is to understand how mild average-case hardness can be amplified to near-optimal hardness. Classical results such as Yao's XOR lemma establish this principle for Boolean functions, but the underlying techniques typically apply only to artificially constructed functions rather than to natural computational problems. In this work, we extend hardness amplification beyond the Boolean setting, generalizing the XOR lemma to sums of functions over the finite field $\mathbb{F}_p$, where $p$ is a prime. Specifically, we show that if every efficient algorithm fails to compute a function $f \colon \{0,1\}^n \to \mathbb{F}_p$ on at least a $\delta$-fraction of inputs, then the $k$-wise sum $f^{+k}(x_1,\dots,x_k) = f(x_1) + \cdots + f(x_k)$ is almost optimally unpredictable: no efficient algorithm computes it with success probability exceeding $\frac{1 + \varepsilon}{p}$, for suitable parameters $k$, $\delta$, and $\varepsilon$. Our proof builds on the pseudo-average min-entropy characterization of unpredictability due to Zheng (2014) and Vadhan and Zheng (2012), which we simplify and quantitatively refine so that the dependence of the circuit blow-up on all parameters is fully explicit.
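The $k$-wise sum construction can be sketched as follows; this is a minimal illustration only, with an arbitrary random table standing in for the hard function $f$ (the parameters `p`, `n`, and `k` are illustrative choices, not values from the paper).

```python
import random

p = 5   # a prime modulus (illustrative choice)
n = 8   # input length of each block, so each x_i ranges over {0,1}^n
k = 3   # number of blocks summed in f^{+k}

random.seed(0)
# Stand-in for f : {0,1}^n -> F_p; here just a random lookup table.
table = {x: random.randrange(p) for x in range(2 ** n)}

def f(x):
    return table[x]

def f_plus_k(xs):
    """The k-wise sum f^{+k}(x_1,...,x_k) = f(x_1) + ... + f(x_k) over F_p."""
    assert len(xs) == k
    return sum(f(x) for x in xs) % p

# Evaluate f^{+k} on a uniformly random tuple of inputs.
xs = [random.randrange(2 ** n) for _ in range(k)]
print(f_plus_k(xs))
```

The amplification statement asserts that even if $f$ is only mildly hard, predicting this mod-$p$ sum on random inputs cannot be done much better than the trivial $1/p$ guessing strategy.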
As an application, we obtain the first error-tolerant random self-reduction for a natural subgraph counting problem: any circuit that correctly counts triangles in an Erdős--Rényi random graph with noticeable probability can be converted into a worst-case triangle-counting circuit with only quasi-linear overhead.
We further extend the query lower bound framework of Shaltiel and Viola (2010) to the $\mathbb{F}_p$-valued setting, proving that any (possibly adaptive) black-box hardness amplification over $\mathbb{F}_p$ must make $\Omega(p\log(1/\delta)/\varepsilon^2)$ oracle queries. Our proof substantially simplifies the core \emph{fixed-set lemma} underlying previous analyses, replacing it with a more modular, entropy-based argument.