This was one of those things where everything cleared up once the right questions were asked, rather than just throwing hypotheses around. The function I ended up writing looks like this:
```cpp
byte charToPattern(byte letter)
{
  // Space also doubles as the first shift in a chord
  if ( letter == 32 ) { return 64; }  // express conversion: Space

  // walk the whole key mapping
  for (byte i = 0; i < PATTERNSIZE; i++)
  {
    // plain letters: return the typical letter pattern
    if ( letter == ('a' + i) ) {
      return chordPatterns[i];
    }
    // digits: return the pattern with the 6th bit set (number shift)
    if ( letter < 58 && letter == ('0' + i) ) {
      return chordPatterns[i] | 64;
    }
    // k-y cases ( !"#$%&'()*+,-./ ): return the pattern with the 6th bit set
    if ( letter > 32 && letter < 48 && letter == (23 + i) ) {
      return chordPatterns[i] | 64;
    }
    // a-g cases ( :;<=>?@ ): return the pattern with the 7th bit set
    if ( letter < 65 && letter == (':' + i) ) {
      return chordPatterns[i] | 128;
    }
    // h-m cases ( [\]^_` ): return the pattern with the 7th bit set
    if ( letter > 90 && letter < 97 && letter == (84 + i) ) {
      return chordPatterns[i] | 128;
    }
    // n-q cases ( {|}~ ): return the pattern with the 7th bit set
    if ( letter > 122 && letter < 127 && letter == (110 + i) ) {
      return chordPatterns[i] | 128;
    }
  }
  return 0;
}
```
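For anyone who wants to poke at it, here is a minimal test sketch around the function. PATTERNSIZE and the chordPatterns values below are just stand-ins for the real chord table, so the exact numbers printed are only illustrative:

```cpp
// Minimal test sketch -- paste the charToPattern() definition from above into it.
// PATTERNSIZE and chordPatterns are placeholders; substitute the real chord table.
#define PATTERNSIZE 26
byte chordPatterns[PATTERNSIZE] = { 1, 3, 9, 25, 17 };  // dummy values, the rest default to 0

byte charToPattern(byte letter);  // definition is the function above

void setup() {
  Serial.begin(9600);
  Serial.println(charToPattern(' '), BIN);  // the space/shift chord: 1000000
  Serial.println(charToPattern('a'), BIN);  // plain letter pattern, no shift bits
  Serial.println(charToPattern('1'), BIN);  // 'b' pattern with the 6th bit set (number shift)
  Serial.println(charToPattern('@'), BIN);  // 'g' pattern with the 7th bit set
}

void loop() { }
```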
EDIT: I updated your code formatting, edit your post to see how to do it - BDub
Sorry, md kinda sucks for keeping code formatting; that looked awful. Anyhow, I ended up using OR instead of bitWrite(). That works in this situation because neither of the bits I want to touch is ever high to begin with.
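To make that concrete (the pattern value below is made up): OR-ing in 64 or 128 can only ever set a bit, and because those bits are always clear in the stored chord patterns, it gives exactly the same result bitWrite() would:

```cpp
void setup() {
  Serial.begin(9600);

  byte pattern = 0b00011010;      // example chord pattern; bits 6 and 7 start out clear

  byte withOr = pattern | 64;     // set the number-shift bit by OR-ing
  byte withBitWrite = pattern;
  bitWrite(withBitWrite, 6, 1);   // the same thing done with bitWrite()

  Serial.println(withOr == withBitWrite ? "same result" : "different");  // prints "same result"
}

void loop() { }
```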
Also, I want to reiterate that the premise of my initial question was completely invalid; it came from some fluke output. I figured it out! Thanks for your time.