There is no need for binary literals. Every low-level programmer should know the first 16 hexadecimal digits in their sleep.
With that you can simply write char c = 0x5A;. It is more readable, since the digits are more distinct from one another, and it is shorter. I actually made a mistake when I first scanned your numbers and had to look carefully. Imagine if the number were a 64-bit integer.
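Roughly what that difference looks like (the 64-bit mask below is just an invented value for illustration, and 0b literals require C++14):

    #include <cstdint>

    // Same value written both ways: the hex form is shorter and easier to scan.
    char c_hex = 0x5A;
    char c_bin = 0b01011010;

    // For a 64-bit value the gap is dramatic: 16 hex digits versus 64 bits.
    std::uint64_t mask_hex = 0xA5A5A5A5A5A5A5A5;
    std::uint64_t mask_bin = 0b1010010110100101101001011010010110100101101001011010010110100101;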
Also <bitset>, if you intend to use C++ correctly.
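A minimal sketch of what std::bitset buys you (the values are just placeholders):

    #include <bitset>
    #include <iostream>

    int main() {
        std::bitset<8> flags("01011010");      // construct from a binary digit string
        std::cout << flags << '\n';            // prints 01011010
        std::cout << flags.to_ulong() << '\n'; // prints 90, i.e. 0x5A
        flags.set(0);                          // turn on the lowest bit
        std::cout << flags.count() << '\n';    // prints 5 (number of bits set)
    }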
You shouldn't use binary everywhere, just like you shouldn't use hex everywhere. Binary literals make sense for bit data. For example, coding an icon for a game:
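Something along these lines (the sprite itself is an invented example; 0b literals require C++14):

    #include <cstdint>

    // An 8x8, one-bit-per-pixel sprite: each binary literal is one row,
    // so the shape of the icon is visible directly in the source.
    const std::uint8_t icon[8] = {
        0b00111100,
        0b01000010,
        0b10100101,
        0b10000001,
        0b10100101,
        0b10011001,
        0b01000010,
        0b00111100,
    };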