The first part of your message is pretty clear, but this one isn't.
Do you understand how the RAM in the Apple II is arranged? I thought I was being reasonably clear. In computer-speak, the "least significant bit" is the storage element in a byte/word of memory that holds the smallest quantity. I.e., a standard "byte" (which, conveniently, is also the Apple ]['s memory word size) has 8 binary bits capable of storing unsigned integers from 0 to 255. The value each bit contributes is, starting from the least significant:
bit 1: 1
bit 2: 2
bit 3: 4
bit 4: 8
bit 5: 16
bit 6: 32
bit 7: 64
bit 8: 128
I.e., if you store the number 197 (128+64+4+1) at a given memory location, it's represented by "1"s in bit positions 8, 7, 3, and 1, and zeros in positions 2, 4, 5, and 6. Logically, each 4116 chip is a 16K-long array of *single bit* memory locations. (I.e., each chip holds only 1/8th of a full 16 kilobytes of RAM.) That's why the Apple ][ has three rows of 8 chips: one row per 16K bank, one chip per bit position.
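To make the bit decomposition concrete, here's a minimal Python sketch (mine, not from the MECC program) that splits a byte into its 8 bits, least significant first, the same order as the chip row:

```python
# Decompose a byte into its 8 bits, least significant first.
# In a fully populated row of 4116s, each bit position maps to one chip.
def bits_lsb_first(value):
    return [(value >> i) & 1 for i in range(8)]

# 197 = 128 + 64 + 4 + 1, so bits 1, 3, 7, and 8 (1-indexed) are set:
print(bits_lsb_first(197))  # [1, 0, 1, 0, 0, 0, 1, 1]
```

Reading the list left to right gives you exactly the physical picture: which chip in the row stores a 1 and which stores a 0 for that byte.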
So... again, I don't know what the output from the MECC program looks like. (If you'd included a screenshot/photo I could possibly interpret it.) I assume it spits out something like: "Error at bit location 0x4567: pattern: 55, read: 45". What this would tell you is that the program wrote this out to memory in binary (I'm putting the least significant bit on the left to match the Apple's physical chip layout; humans writing in European languages would of course usually put the LSB on the right):
10101010
But when it read it back it got this:
10100010
From this you'd know, per my example, that the fifth chip over has a bit that's stuck at zero.
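That "compare written vs. read" step is just an XOR. A quick Python sketch (again mine, using the hypothetical 55/45 error report above) that turns the two hex values into a list of suspect chip positions:

```python
# XOR the written pattern against the readback: any 1 bit in the
# result is a bit that flipped, i.e. a suspect chip in the row.
def failing_chips(written, read):
    diff = written ^ read
    # Report 1-indexed bit positions, least significant = chip 1.
    return [i + 1 for i in range(8) if (diff >> i) & 1]

print(failing_chips(0x55, 0x45))  # [5]
```

With pattern 55 and readback 45 it flags position 5, matching the "fifth chip over" in my example. A real diagnostic would also need the error address to tell you which *row* of chips the bad one is in.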