General-purpose machine learning algorithms require a large amount of training data. There's no magic that can make accurate predictions without any data to base them on. If you don't have enough data, the information needs to come from somewhere else.
If you have a large amount of data on a similar problem, you can try transfer learning: learn the shared properties from the big dataset, then fine-tune only the domain-specific parts on your smaller one.
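A minimal sketch of what that looks like, assuming PyTorch and torchvision are available: take an ImageNet-pretrained ResNet-18, freeze the backbone that carries the shared properties, and train only a new task-specific head (the 3-class task here is hypothetical).

```python
import torch
import torch.nn as nn
from torchvision import models

# Load a backbone pretrained on a large, similar problem (ImageNet).
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the pretrained layers: these weights encode the shared properties.
for param in model.parameters():
    param.requires_grad = False

# Replace the classifier head with one sized for our (hypothetical) 3-class task.
model.fc = nn.Linear(model.fc.in_features, 3)

# Only the new head's parameters get trained, so a small dataset can suffice.
optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
```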
If you have little quantitative data but access to domain experts, you can build a custom model that encodes their knowledge, leaving fewer parameters that need to be fit to the data you do have.
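A minimal sketch of that idea: suppose an expert tells you the process decays exponentially toward a baseline. That advice pins down the functional form and leaves only three free parameters, which even a handful of points can support (the form and the data here are hypothetical).

```python
import numpy as np
from scipy.optimize import curve_fit

# Expert-informed model: exponential decay toward a baseline, 3 parameters.
def expert_model(t, amplitude, rate, baseline):
    return amplitude * np.exp(-rate * t) + baseline

# Five measurements is nowhere near enough for a neural network,
# but plenty for a three-parameter fit.
t = np.array([0.0, 1.0, 2.0, 4.0, 8.0])
y = np.array([5.1, 3.2, 2.2, 1.3, 1.05])

params, _ = curve_fit(expert_model, t, y, p0=(5.0, 1.0, 1.0))
print(dict(zip(("amplitude", "rate", "baseline"), params)))
```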
But if you have so little data that you can't train a neural network, you probably aren't getting new data very frequently either, so the volume of predictions you need is low. It might be cheaper to just pay a human to look at each case.
If you don't even have enough data for humans to work with, fancy machine learning isn't going to help you.
Sure, there are constraints and pros and cons, but still: pick a neural network for unstructured data. You can always do unsupervised pretraining and fine-tune on a tiny dataset.
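A minimal sketch of that claimed workflow, assuming PyTorch (shapes and dataset sizes are hypothetical): pretrain an autoencoder on unlabeled data, then fine-tune the encoder with a classifier head on a tiny labeled set.

```python
import torch
import torch.nn as nn

encoder = nn.Sequential(nn.Linear(784, 64), nn.ReLU())
decoder = nn.Sequential(nn.Linear(64, 784))

# Phase 1: unsupervised pretraining, reconstructing unlabeled inputs.
unlabeled = torch.rand(10_000, 784)  # note: still needs plenty of unlabeled data
autoencoder = nn.Sequential(encoder, decoder)
opt = torch.optim.Adam(autoencoder.parameters(), lr=1e-3)
for _ in range(10):
    loss = nn.functional.mse_loss(autoencoder(unlabeled), unlabeled)
    opt.zero_grad()
    loss.backward()
    opt.step()

# Phase 2: fine-tune the pretrained encoder plus a new head on a tiny labeled set.
head = nn.Linear(64, 2)
labeled_x, labeled_y = torch.rand(20, 784), torch.randint(0, 2, (20,))
opt = torch.optim.Adam(list(encoder.parameters()) + list(head.parameters()), lr=1e-4)
for _ in range(10):
    loss = nn.functional.cross_entropy(head(encoder(labeled_x)), labeled_y)
    opt.zero_grad()
    loss.backward()
    opt.step()
```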
... except that NNs require a large amount of training data.