
Fixing base64 decoding error when input string is bad

The following code would crash the previous version when calling MemFree:

        // 53 * A
        const char maliciousBase64Input[] = "AAAAAAAAAAAAAAAAAAAAAAAA"
                                            "AAAAAAAAAAAAAAAAAAAAAAAAAAAAA";
        int decodedSize = 0;
        unsigned char *decodedData = DecodeDataBase64(maliciousBase64Input, &decodedSize);
        if (decodedData) {
            MemFree(decodedData);
        }

The reason is a missing array bounds check in the decoding loop, which in
this case corrupted the heap (the exact behavior is platform dependent).

Adding the bounds checks prevents the memory corruption.
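
As an illustration of the out-of-bounds access (a standalone sketch, not the
raylib code): with 53 valid characters and no padding, a loop that consumes
4 characters per group reaches i = 52 on its last iteration and indexes up
to text[55], three characters past the end of the input.

        #include <stdio.h>

        int main(void)
        {
            // 53 characters, no '=' padding, as in the crashing input above
            int dataSize = 53;

            for (int i = 0; i < dataSize; i += 4)
            {
                // Without a bounds check every group assumes text[i..i+3] exist;
                // the matching 3-byte write for a partial group is what can
                // overrun the output buffer
                if (i + 3 >= dataSize)
                {
                    printf("group at i = %d would read up to text[%d], last valid index is %d\n",
                           i, i + 3, dataSize - 1);
                }
            }

            return 0;
        }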

Tested by encoding random data of sizes 0-1023 and comparing the original
data with the decoded result.
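
A rough sketch of that round-trip test, using raylib's EncodeDataBase64(),
DecodeDataBase64(), GetRandomValue(), TraceLog() and MemFree(), and assuming
the DecodeDataBase64() signature shown in the diff below; the 0-1023 size
range comes from the description above, while the harness itself (buffer
handling, mismatch reporting) is an assumption:

        #include <string.h>
        #include "raylib.h"

        int main(void)
        {
            for (int size = 0; size < 1024; size++)
            {
                // Fill a buffer with random bytes of the current size
                unsigned char original[1024] = { 0 };
                for (int i = 0; i < size; i++) original[i] = (unsigned char)GetRandomValue(0, 255);

                // Encode, then decode the result again
                int encodedSize = 0;
                char *encoded = EncodeDataBase64(original, size, &encodedSize);

                int decodedSize = 0;
                unsigned char *decoded = DecodeDataBase64(encoded, &decodedSize);

                // The decoded data must match the original input exactly
                if ((decodedSize != size) || ((size > 0) && (memcmp(original, decoded, size) != 0)))
                {
                    TraceLog(LOG_WARNING, "Base64 round-trip mismatch at size %d", size);
                }

                MemFree(encoded);
                MemFree(decoded);
            }

            return 0;
        }
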
pull/5170/head
Eike Decker committed 1 month ago
commit b53d60d9c6
1 changed file with 13 additions and 2 deletions

src/rcore.c (+13, -2)

@@ -2671,13 +2671,24 @@ unsigned char *DecodeDataBase64(const char *text, int *outputSize)
     for (int i = 0; i < dataSize;)
     {
         // Every 4 sixtets must generate 3 octets
+        if (i + 2 >= dataSize)
+        {
+            TRACELOG(LOG_WARNING, "BASE64 decoding error: Input data size is not valid");
+            break;
+        }
+
         unsigned int sixtetA = base64DecodeTable[(unsigned char)text[i]];
         unsigned int sixtetB = base64DecodeTable[(unsigned char)text[i + 1]];
-        unsigned int sixtetC = ((unsigned char)text[i + 2] != '=')? base64DecodeTable[(unsigned char)text[i + 2]] : 0;
-        unsigned int sixtetD = ((unsigned char)text[i + 3] != '=')? base64DecodeTable[(unsigned char)text[i + 3]] : 0;
+        unsigned int sixtetC = (i + 2 < dataSize && (unsigned char)text[i + 2] != '=')? base64DecodeTable[(unsigned char)text[i + 2]] : 0;
+        unsigned int sixtetD = (i + 3 < dataSize && (unsigned char)text[i + 3] != '=')? base64DecodeTable[(unsigned char)text[i + 3]] : 0;

         unsigned int octetPack = (sixtetA << 18) | (sixtetB << 12) | (sixtetC << 6) | sixtetD;

+        if (outputCount + 3 > maxOutputSize)
+        {
+            TRACELOG(LOG_WARNING, "BASE64 decoding: Output data size is too small");
+            break;
+        }
         decodedData[outputCount + 0] = (octetPack >> 16) & 0xff;
         decodedData[outputCount + 1] = (octetPack >> 8) & 0xff;
         decodedData[outputCount + 2] = octetPack & 0xff;
