How do we define natural medicine? Unfortunately there are no legal requirements on the meaning of words like this, so companies can use them in any way they choose.
The net result is that terms like natural medicine become devalued, and we don't trust them. However, what the term natural medicine should refer to is food or other ingredients that have the ability to cure disease or to restore good health.
A definition such as this extends the concept of what helps and heals us to include a whole range of foods and ingredients that nature offers us.
It is also a concept that requires us to take much greater responsibility for our own health. This is not about going to the doctor every time we are unwell and expecting an instant cure; it is about accepting that we are responsible for our own health, and that the actions we take on a daily basis will have consequences, either positive or negative.
We therefore have to learn what whole foods do for us, and the ways in which traditional healing practices can help us maintain good health without resorting to intrusive interventions such as surgery and powerful pharmaceuticals.
Ultimately, it is about making lifestyle choices that have a positive impact on our physical, nutritional, and emotional well-being, so that we can live, act, and feel our best.