An $(n,k)$-bit-fixing source is a distribution on $n$-bit strings that is fixed on some $n-k$ of the coordinates and jointly uniform on the remaining $k$ bits. Explicit constructions of bit-fixing extractors by Gabizon, Raz and Shaltiel [SICOMP 2006] and Rao [CCC 2009] extract $(1-o(1)) \cdot k$ bits for $k = \mathrm{polylog}(n)$, almost matching the probabilistic argument. Intriguingly, unlike for other well-studied sources of randomness, a result of Kamp and Zuckerman [SICOMP 2006] shows that, for any $k$, some small portion of the entropy in an $(n,k)$-bit-fixing source can be extracted. Although the extractor does not extract all the entropy, it does extract $(1/2 - o(1)) \cdot \log(k)$ bits.
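As a toy illustration of why roughly $\log(k)/2$ bits are always extractable, the following Python sketch samples from a bit-fixing source and outputs the sum of the bits modulo a small odd $m \le \sqrt{k}$. This is not the actual Kamp--Zuckerman random-walk construction; it is a simpler sketch resting on the standard observation that the fixed coordinates only shift the sum by a constant, while the binomial sum of the $k$ free bits is close to uniform modulo any odd $m$ well below $\sqrt{k}$. All function names and parameter choices below are illustrative.

```python
import random

def sample_bit_fixing(n, free_positions, fixed_value, rng):
    """Sample from an (n,k)-bit-fixing source: every coordinate holds
    fixed_value except those in free_positions, which are uniform bits."""
    x = [fixed_value] * n
    for i in free_positions:
        x[i] = rng.randint(0, 1)
    return x

def sum_mod_extractor(x, m):
    """Output the bit-sum modulo m (about log2(m) output bits).
    Fixed coordinates shift the sum by a constant, which does not
    affect uniformity modulo m."""
    return sum(x) % m

rng = random.Random(0)
n, k = 200, 64
m = 7                       # odd modulus with m <= sqrt(k): ~log2(7) = 2.8 bits out
free = rng.sample(range(n), k)  # adversary's (hidden) choice of free coordinates

# Empirically check that the output is close to uniform on {0, ..., m-1}.
trials = 20_000
counts = [0] * m
for _ in range(trials):
    counts[sum_mod_extractor(sample_bit_fixing(n, free, 1, rng), m)] += 1

max_dev = max(abs(c / trials - 1 / m) for c in counts)
```

Running the check, `max_dev` is small (sampling noise only): the output of the sum-mod-$m$ map is statistically close to uniform, matching the $(1/2 - o(1)) \cdot \log(k)$ extraction rate, since $\log(m) \approx \log(\sqrt{k})$.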
In this paper we prove that when the entropy $k$ is small enough compared to $n$, this exponential entropy loss is unavoidable. More precisely, one cannot extract more than $\log(k)/2 + O(1)$ bits from $(n,k)$-bit-fixing sources; the remaining entropy is information-theoretically inaccessible. By the Kamp--Zuckerman construction, this negative result is tight.
Our impossibility result also holds for what we call zero-fixing sources. These are bit-fixing sources in which the fixed bits are set to $0$. We complement our negative result by giving an explicit construction of an $(n,k)$-zero-fixing extractor that outputs $\Omega(k)$ bits, even for $k = \mathrm{polyloglog}(n)$. Furthermore, we give a construction of an $(n,k)$-bit-fixing extractor that outputs $k-O(1)$ bits for entropy $k = (1+o(1)) \cdot \log\log{n}$, with running time $n^{O((\log\log{n})^2)}$.