Show Posts

This section allows you to view all posts made by this member. Note that you can only see posts made in areas you currently have access to.

Topics - jessica

Hot Topics / The opposition.
« on: September 17, 2015, 12:39:11 am »
I can't even get past the first sentence.  Vegans, people who think they can have this totally sanitary version of life and diet......telling people it's okay to eat GMOs.  Veganism is a sure way to destroy nature and humanity.

Sep 15 at 10am
I’m a vegan, which means I don’t eat meat, eggs, or dairy. It’s a philosophical stance, since I believe it’s wrong to butcher sentient beings for food. I’m also a scientist. I studied genetics in college, and now I help build and fund biotech start-ups. I work in downtown San Francisco and live in Oakland (which I love).

Living here in the Bay Area, I see a lot of confusion about genetically modified foods. For instance, some people think that if you eat non-GMO produce, it means you’re eating natural, but that’s wrong (fancy scientists would call this thinking the “naturalistic fallacy”).

Almost nothing you or I eat is actually natural. Our ancestors have been selectively breeding plants for at least 20,000 years, and in that time, they’ve developed most of the foods we know, love, and eat today. I hate to be the one to say it, but our food is almost all artificial—the corn you eat, the bananas you enjoy, even your broccoli and cauliflower. None of those foods is natural; they were altered by our ancestors.

And no, eating a Paleo diet does not get you closer to how we evolved to eat. The cows you eat today aren’t bison, and those pigs aren’t boars. Look again (and watch your cholesterol). Primates don’t eat that much meat!

Even more importantly, the practice of agriculture is also unnatural. Those organic apples you love are made from clones, grown artificially in fields that are artificially irrigated and artificially treated with biopesticides to prevent the destruction of crops by insects. In fact, the same gene that is inserted into GMO corn encodes an insecticide called Bt, and that same biopesticide is sprayed on your organic apples. It's the exact same biopesticide. Think about that for a second, and then do a double take the next time you hear or read organic marketing materials. I repeat: it's the same thing. Look it up!

Remember those biblical plagues of locusts and the famines that followed? Where did they go? That’s right—science and modern agriculture eradicated them (for now). If we were to grow food naturally, the organic produce wouldn’t even last one growing cycle—the plants would just be sticks in a field.

So why are so many people anti-GMO? Is it because the food is toxic? No, there’s no substantive data to support this perception, and no, you should not believe any of those sketchy blog sources with conspiracy theories. The reality of it is, if you believe climate change is real because of the overwhelming data, then why wouldn’t you believe GMOs are safe due to the overwhelming data in support of them? I’d encourage you to dive deeper into the science, specifically a scientific paper published in 2013 that reviewed 10 years of scientific data and peer-reviewed published papers about the safety of GMO crops and found that “scientific research conducted so far has not detected any significant hazards directly connected with the use of GE [genetically engineered] crops.” That is, GMO crops are safe!

“Ahh,” you’re thinking, “but the EU has banned GMOs. Gotcha!”

The EU’s GMO ban is nothing but a political smoke screen. In reality, the EU eats a ton of GMO products. Take cheese, for instance. Do you love your British Cheddar? If so, you’re eating a GMO product: microbially derived rennet, an enzyme needed for cheese production and used in the vast majority of cheeses. If we were to get this enzyme naturally, it would come from the stomachs of dead calves (no joke). As a result of the GMO rennet, cheese is now vegetarian and a lot less gross (once you know how it’s made). So the anti-GMO posture of the EU is mostly just political theater.

So why, specifically, am I a pro-GMO vegan? Because I believe that mindfulness, knowledge, and science are more important than political games. I am against the use of sentient beings in the production of our food, and I think we can use science to give everyone what they want—milk, cheese, eggs, and even real meat—without harming animals.

My hope and aim through my work and the work of others is that one day (in a few years), you’ll all be pro-GMO vegans. Even if you keep eating steak, one day it will be lab grown, rather than sourced from suffering animals in cages—and that will be just fine by me. I might even have a few bites. Yep, real vegan steak. Ponder that for a while!


A creepy, cloaked figure that's lurking around a North Carolina community and reportedly dropping raw meat in playgrounds has residents on edge.

Photos of the mysterious person covered in a dark, hooded cloak surfaced on social media earlier this month and went viral after sparking fear among some in the Gastonia neighborhood where the images were believed to be snapped.

"I see why it could be easy to make a joke out of it, but this is serious," resident Brooke Conrad told The Gaston Gazette. "We live in a world today where you don't know what's going on and you don't know people."

It's unclear if the figure is a man or a woman. The pale person was snapped standing by trees outside the Hudson Creek apartment complex, the local paper reported.

Some say they have seen the person dropping meat near a playground. An unnamed resident told the newspaper he found a bag of raw meat near the apartment two weeks ago.

A Reddit user who posted the photo on the popular networking site said that, according to the husband of the person who took the original image, the figure would "raise its hands upwards and then down."

Police have not confirmed the meat rumor.

"We don't know if this is some bogus prank somebody is playing," Gastonia Police Department spokeswoman Donna Lasher told The Gazette.

Meanwhile, residents are left scratching their heads and demanding answers.

"This isn't just teenagers playing a joke in Gastonia a woman also saw someone just like this doing the same thing two hours away in their town as well," local Kelsie Cooper wrote on Facebook.

"This is a serious thing that needs to be addressed."


"L-arginine is found naturally in red meat, poultry, fish and dairy products
The amino acid was found to break down dental plaque which is known to trigger gum disease, as well as dental cavities"

General Discussion / Edibility of raw meat discussed on major website
« on: August 09, 2014, 04:15:28 am »
I think it's interesting, disheartening and frustrating to read through the comments, but this is what "educated" middle- to upper-class America thinks about raw meat.

Might our posterchild Derek consider doing an Internet Q&A in the "ask me anything" section of Reddit?  It's funny that this website is user-based but ends up being a huge news source for lazy anchors across the country.

Off Topic / Footage from fishing village
« on: July 04, 2014, 10:27:51 am »


Japanese people have special tools that let them get more out of eating sushi than Americans can. They are probably raised with these utensils from an early age and each person wields millions of them. By now, you’ve probably worked out that I’m not talking about chopsticks.

The tools in question are genes that can break down some of the complex carbohydrate molecules in seaweed, one of the main ingredients in sushi. The genes are wielded by the hordes of bacteria lurking in the guts of every Japanese person, but not by those in American intestines. And most amazingly of all, this genetic cutlery set is a loan. Some gut bacteria have borrowed their seaweed-digesting genes from other microbes living in the coastal oceans. This is the story of how these genes emigrated from the sea into the bowels of Japanese people.

Within each of our bowels, there are around a hundred trillion microbes, whose cells outnumber our own by ten to one. This ‘gut microbiome’ acts like an extra organ, helping us to digest molecules in our food that we couldn’t break down ourselves. These include the large carbohydrate molecules found in the plants we eat. But marine algae – seaweeds – contain special sulphur-rich carbohydrates that aren’t found on land. Breaking these down is a tough challenge for our partners-in-digestion. The genes and enzymes that they normally use aren’t up to the task.

Fortunately, bacteria aren’t just limited to the genes that they inherit from their ancestors. Individuals can swap genes as easily as we humans trade money or gifts. This ‘horizontal gene transfer’ means that bacteria have an entire kingdom of genes, ripe for the borrowing. All they need to do is sidle up to the right donor. And in the world’s oceans, one such donor exists – a seagoing bacterium called Zobellia galactanivorans.

Zobellia is a seaweed-eater. It lives on, and digests, several species including those used to make nori. Nori is an extremely common ingredient in Japanese cuisine, used to garnish dishes and wrap sushi. And when hungry diners wolfed down morsels of these algae, some of them also swallowed marine bacteria. Suddenly, this exotic species was thrust among our own gut residents. As the unlikely partners mingled, they traded genes, including those that allow them to break down the carbohydrates of their marine meals. The gut bacteria suddenly gained the ability to exploit an extra source of energy and those that retained their genetic loans prospered.

This incredible genetic voyage from sea to land was charted by Jan-Hendrik Hehemann from the University of Victoria. Hehemann was originally on the hunt for genes that could help bacteria to digest the unique carbohydrates of seaweed, such as porphyran. He had no idea where this quest would eventually lead. Mirjam Czjzek, one of the study leaders, said, “The link to the Japanese human gut bacteria was just a very lucky, opportunistic hit that we clearly had no idea about before starting our project. Like so often in science, chance is a good collaborative fellow!”

Hehemann began with Zobellia, whose genome had been recently sequenced. This bacterium turned out to be the proud owner of no fewer than five porphyran-breaking enzymes. These enzymes were entirely new to science, they are all closely related and they clearly originated in marine bacteria. Their unique ability earned them the name of ‘porphyranases’ and the genes that encode them were named PorA, PorB, PorC and so on.

They are clearly not alone. Using his quintet as a guide, Hehemann found six more genes with similar abilities. Five of them hailed from the genomes of other marine bacteria – that was hardly surprising. But the sixth source was a far bigger shock: the human gut bacterium Bacteroides plebeius. What was an oceanic gene doing in such an unlikely species? Previous studies provided a massive clue. Until then, six strains of B.plebeius had been discovered, and all of them came from the bowels of Japanese people.

Nori is, by far, the most likely source of bacteria with porphyran-digesting genes. It’s the only food that humans eat that contains any porphyrans and until recently, Japanese chefs didn’t cook nori before eating it. Any bacteria that lingered on the green fronds weren’t killed before they could mingle with gut bacteria like B.plebeius. Ruth Ley, who works on microbiomes, says, “People have been saying that gut microbes can pick up genes from environmental microbes but it’s never been demonstrated as beautifully as in this paper.”

In fact, B.plebeius seems to have a habit of scrounging genes from marine bacteria. Its genome is rife with genes that are more closely related to their counterparts in marine species like Zobellia than to those in other gut microbes. All of these borrowed genes do the same thing – they break down the complex carbohydrates of marine algae.

To see whether this was a common event, Hehemann screened the gut bacteria of 13 Japanese volunteers for signs of porphyranases. These “gut metagenomes” yielded at least seven potential enzymes that fitted the bill, along with six others from another group with a similar role. On the other hand, Hehemann couldn’t find a single such gene among 18 North Americans. “We were trying at lunch to think about where you might see patterns this clean,” says Ley. “You’d have to find another group of people with a very specialised diet.  Because this involved seaweed and marine bacteria, it might be one of the cleanest demonstrations you’d get.”

For now, it’s not clear how long these marine genes have been living inside the bowels of the Japanese. People might only gain the genes after eating lots and lots of sushi but Hehemann has some evidence that they could be passed down from parent to child. One of the people he studied was an unweaned baby girl, who had clearly never eaten a mouthful of sushi in her life. And yet, her gut bacteria had a porphyranase gene, just as her mother’s did. We already know that mums can pass on their microbiomes to their children, so if mummy’s gut bacteria can break down seaweed carbs, then baby’s bugs should also be able to.

Are we what we eat?

This study is just the beginning. Throughout our history, our diet has changed substantially and every mouthful of new food could have acted as a genetic tasting platter for our gut bacteria to sample. Personally, I’ve been eating sushi for around two years and I was intrigued to know if my own intestinal buddies have gained incredible new powers since then. Sadly, Czjzek dispelled my illusions. “Today, sushi is prepared with roasted nori and the chance of making contact with marine bacteria is low,” she said. The project’s other leader, Gurvan Michel, concurs. He notes that of all the gut bacteria from the Japanese volunteers, only B.plebeius has acquired the porphyranase enzymes. “This horizontal gene transfer remains a rare event,” he says.

Michel also says that for these genes to become permanent fixtures of the B.plebeius repertoire, the bacterium would have needed a strong evolutionary pressure to keep them. “Daily access to ingested seaweeds as a carbon source” would have provided such a pressure. My weekly nibbles on highly sterile pieces of sushi probably wouldn’t.

That’s one question down; there are many to go. How did the advent of agriculture or cooking affect this genetic bonanza? How is the Western style of hyper-hygienic, processed and mass-produced food affecting it now? As different styles of cuisines spread all over the globe, will our bacterial passengers also become more genetically uniform?

The only way to get more answers is to accelerate our efforts to sequence different gut microbiomes. Let’s take a look at those of other human populations, including hunter-gatherers. Let’s peer into fossilised or mummified stool samples left behind by our ancestors. Let’s look inside the intestines of our closest relatives, the great apes. These investigations will tell us more about the intestinal genetic trade that has surely played a big role in our evolution.

Rob Knight, a microbiome researcher from the University of Colorado, agrees. “This result reinforces the need to conduct a broad and culturally diverse survey of who harbours what microbes. The key to understanding obesity or IBD might well be in genes or microbes acquired under circumstances very different to those we experience in Western society.” Gastronomics, anyone?

Reference: Hehemann, J., Correc, G., Barbeyron, T., Helbert, W., Czjzek, M., & Michel, G. (2010). Transfer of carbohydrate-active enzymes from marine bacteria to Japanese gut microbiota. Nature, 464(7290), 908–912. DOI: 10.1038/nature08937


Hot Topics / Brain food: the history of skull drinking
« on: April 23, 2014, 07:40:41 am »

Brain food: the history of skull drinking

The Cheddar cave dwellers who used skulls as drinking cups were in good company – many have gone much further
Richard Sugg

 Friday 18 February 2011 08.29 EST   

It is now more than 15 years since I paid a visit to the sleepy town of Sedlec, just outside Prague. But I can still vividly recall the strangeness of leaving the sunlit graveyard to descend into a church where huge bell shapes had been formed from human skulls and bones, along with a skeletal coat of arms and a chandelier fashioned from every bone in the human body. The skeletons had been disinterred because the site was so popular as a burial place, having been supposedly sprinkled with earth from Golgotha in about 1278. The Kostnice ossuary is a striking example of how the sacred can legitimise seemingly macabre or taboo uses of the body.

This week, Ian Sample has reported on a more primitive but broadly similar use of the human skull in caves in Cheddar, Somerset, where research shows they were "skilfully fashioned into cups, with the rest of the bodies probably being cannibalised". When body parts become artistic or domestic objects we find ourselves caught between repulsion and ambivalent fascination – especially when a skull is used as a drinking vessel.

This friction between "civilised" artistry and "savage" man-eating looks rather different if we remind ourselves of the very strange things done with human skulls and bodies by educated Christians during the peak of Britain's artistic and scientific revolutions. For much of the past 500 years, the big question regarding skulls has not been "should you drink from them?" so much as "should you drink the skull itself?"

Reasons for doing so were usually medical or magical, but there were other motivations. The "barbarous Scythians", for example, were held to eat their enemies and drink blood from their skulls. Later, in a notorious Renaissance tale, an adulteress was compelled by her husband to drink from the gilded skull of her lover at supper each night, and the skull of James IV of Scotland was used as a flowerpot in the English royal conservatory. Opposed to these kinds of negative, punitive uses existed a range of positive ones: at least since Herodotus, it has been reported that the Issedones decorated with gold the skulls of their dead parents, using them as commemorative drinking cups in following years.

In the 17th century, privileged medical patients paid very high sums not to drink from skulls, but to drink the skull itself. Skull could be taken either powdered or in the more refined form of a liquid distillation. It was taken and indeed made by Charles II – a figure who, having paid perhaps thousands for the recipe, became so closely associated with this therapy that it was soon known as "the king's drops". These were used on Charles's own deathbed in 1685, and on that of Queen Mary in 1698.

Others went further: should you drink only the skull, or the whole head? The Belgian chemist Jean Baptiste Van Helmont believed that you should allow the brain matter to dissolve into the skull, which – nicely marinaded over time – then absorbed the body's vital powers. And if you balked at this, you could still use a kind of moss found on unburied skulls. Usually powdered, this was thrust into the nostrils of those suffering nosebleeds: Robert Boyle, among others, swore by it. Such treatments – part of a widespread tradition of "medicinal cannibalism" using flesh, fat, blood and bone – were so popular that come the 18th century, there were customs duties on skulls imported from the battlefields of Ireland.

Although the educated had abandoned such habits by about 1900, the poor had not. In Bradford in 1847, a father gave his daughter grated skull for her epilepsy and in Ruabon, Wales, a mother did the same in 1865. But the most enduring piece of "skull medicine" in the UK is one that very closely resembles the habits of the Cheddar cave dwellers. It was believed that certain conditions could be cured if the sufferer drank from the skull of "a suicide". Such use was recorded in England in 1858, while in the Highland parish of Nigg in the 19th century an epileptic boy was given powder from the skull of someone who had killed themselves – to obtain which, "a journey of well over 60 miles had to be made". Mary Beith tells of how, as late as 1909, a Scots epileptic resorted to a healer in Lewis after two years of professional treatment in Edinburgh: "The sufferer was also directed to drink out of a copann-cinn (skull-pan) taken from an old cemetery on a small island, which he did for some weeks, reporting ... that 'the peculiar taste was fresh in the mouth the next morning as it was on the previous night'."

Clearly, then, "civilised" people could do far more surprising things with corpses than the savage cannibals of Cheddar. And we can also now see two possible reasons why the cavemen made skull-cups: it may well have been to honour and remember their dead; but it may also have been to try and imbibe magical or healing powers from their dry bones.


Bones from a Cheddar Gorge cave show that cannibalism helped Britain's earliest settlers survive the ice age

New carbon dating techniques reveal that 14,700 years ago humans living in Gough's Cave in the Mendips acquired a taste for the flesh of their relatives, and not just for ritual reasons
Robin McKie, science editor

The Observer, Saturday 19 June 2010   
Scientists have identified the first humans to recolonise Britain after the last ice age. The country was taken over in a couple of years by individuals who practised cannibalism, they say - a discovery that revolutionises our understanding of the peopling of Britain and the manner in which men and women reached these shores.

Research has shown that tribes of hunter-gatherers moved into Britain from Spain and France with extraordinary rapidity when global warming brought an end to the ice age 14,700 years ago and settled in a cavern – known as Gough's Cave – in the Cheddar Gorge in what is now Somerset.

From the bones they left behind, scientists have also discovered these people were using sophisticated butchering techniques to strip flesh from the bones of men, women and children.

"These people were processing the flesh of humans with exactly the same expertise that they used to process the flesh of animals," said Professor Chris Stringer of the Natural History Museum in London. "They stripped every bit of food they could get from those bones."

The discovery of the speed of Britain's recolonisation after the last ice age, and the disquieting fate of some of those first settlers, is the result of two major technological breakthroughs. The first involves the development of a technique known as ultra-filtration carbon dating. Perfected by scientists based at Oxford University's radiocarbon accelerator unit, it allows researchers to pinpoint the ages of ancient bones and other organic material with unprecedented accuracy.

The second breakthrough involves the use of a machine known as the Alicona 3D microscope. Using this device, Dr Silvia Bello of the Natural History Museum has studied the cut marks left on bones of humans and animals in Gough's Cave. Scientists already knew cannibalism had been practised in the cavern, but were unclear if it was a ritual process or involved the deliberate killing of humans. However, Bello has found humans had been butchered with the same stone tools that had been used to cut up animals. In other words, animal and human flesh was treated the same way by these early Britons.

In addition to these findings, the discovery – by Danish scientists a few years ago – that the last ice age ended with astonishing rapidity has also played a key role in reappraising the recolonisation of Britain. Far from being a gradual process, in which men and women slowly reoccupied territory that had been taken from them by spreading glaciers, the resettling of Britain now appears to have been rapid, dramatic and bloody.

For around 60,000 years the planet had shivered as ice sheets fluctuated over large parts of the northern and southern hemisphere – including Britain, then a peninsula of northern Europe, which supported a small population of humans for much of this time. However, around 24,000 years ago, the weather worsened drastically. Britain's last inhabitants either died out or headed southwards for some continental warmth in refuges in northern Spain and central France.

Britain's icy desolation ended abruptly 14,700 years ago when there was a dramatic leap in temperatures across the globe according to ice-cores found in Greenland and lake sediments in Germany. In less than three years, temperatures had soared by around 6 to 7 degrees Celsius and ice sheets began a rapid retreat throughout the world.

Such a jump in temperature brought about an astonishing change in the world's weather patterns – though the underlying cause remains unclear, scientists admit. Suggestions include the proposal that variations in the orbit of the Earth around the Sun allowed more solar radiation to bathe the planet and so warm it up. It has also been proposed that there may have been a sudden eruption of carbon dioxide from the oceans. This helped trap heat from the sun in the atmosphere and so heat up the world.

"Whatever the reason, it was good news in those days, because the world was so cold and so it heated up nicely. However, if a rise like that happened today it would be devastating," said Dr Tom Higham, deputy director of the Oxford radiocarbon unit. "The world would be scorched. That is one of the most important aspects of the story of the resettling of Britain."

Higham's work, in collaboration with his late colleague Roger Jacobi, has involved studying the ages of the bones found at Gough's Cave in the Somerset Mendips, the earliest post-ice age site at which modern human remains have been found. The bones of half a dozen people – including children, adolescents and adults – were found in the cave in the 1980s, a discovery that made national headlines when it was revealed that these remains bore patterns of cut marks that suggested they had been the victims of cannibalism.

Other sites of this antiquity, in Germany and France, have also supplied evidence that human bones had been butchered. But the Gough's Cave finds were puzzling because radiocarbon dates indicated that humans had used the cave for more than 2,000 years, including several centuries in which the country would have been covered in ice sheets.

"The problem with radiocarbon dates of this antiquity is that it only takes a tiny trace of contamination from modern organic material to distort results," said Higham. "That is why we kept getting such a range of ages from the Gough's Cave bones."

To get round this problem, Jacobi and Higham worked on a technique – known as ultra-filtration – which involves using a series of complex chemical treatments to destroy any modern contamination in samples taken from the cave. First results of dates supplied using this technique were published by the scientists in a paper in Quaternary Science Reviews last year and were based on their re-analysis of the bones of Gough's Cave. These revealed a very different picture for the ages for the bones than had previously been calculated.

Instead of dates being spread over a couple of thousand years, the new ones clustered tightly round an age of 14,700 years before present – the exact moment that the world had begun its dramatic defrosting. Within a year or two, humans had left their southern refuges and were heading north into Britain, it was revealed. In other words, the end of the ice age was almost instantaneous – and so was the manner in which we exploited it.

In those days, humans were nomadic hunter-gatherers: strong, relatively well-nourished individuals who followed the herds of wild horses that then roamed Europe. These animals provided men, women and children with their main source of protein. "The weather suddenly got warm, the horses headed north and men and women followed them," said Higham. "It would have been a very rapid business."

As for the route of this migration, it probably took these ancient hunter-gatherers across Doggerland – a now submerged stretch of land in the North Sea that is known as Dogger Bank today – and into eastern England. Within a couple of years, they had reached Gough's Cave, though the cavern would not have formed a permanent residence but would most likely have served as a refuge to which they could return on a regular basis.

Previously it had been thought that the cave had been occupied, on or off, for around 2,000 years. However, the new set of dates generated by Higham shows that these not only cluster round the date of 14,700 years before the present, but that they cover only a very narrow range of about a hundred years or less. In other words, the cave was occupied for only a few generations at that time.

However, it is the behaviour of those few generations that has perplexed scientists for the past 20 years and which led to the new investigation by Bello. "The bone fragments we have found suggest we are looking at the remains of five individuals," she said. "These remains include one young child, aged between three and four, two adolescents, a young adult and an older adult. So we have every kind of age group represented in the Gough's Cave remains."

Bello has found that each of these sets of remains is covered with marks that show they had been the subject of comprehensive butchery, with all muscle and tissue being stripped from them. But why de-flesh those bones in the first place? What triggered such an extreme act? To provide answers, scientists have put forward a number of different theories. These include suggestions that it was a form of ritual which involved the eating of small pieces of a relative's flesh, not as a source of nutrition, but as an act of homage.

Others have argued that it involved a form of crisis cannibalism in which people ate the flesh of others because all other sources of food had disappeared. "An example of that sort of cannibalism was provided by the Andes air crash in 1972 when survivors ate the flesh of those who had been killed in the accident," said Stringer.

And finally there is straightforward cannibalism in which humans hunt, kill and eat other humans because they have a preference for human flesh. This is sometimes known as homicidal cannibalism.

The new evidence that is emerging from Bello's work does not resolve the issue, though some significant pointers have been uncovered. "These people were breaking up bones to get at the marrow inside," she said. "They were stripping off all of the muscle mass. Brains seemed to have been removed. Tongues seemed to have been removed. And it is also possible that eyes were being removed. It was very systematic work." In addition, human remains appear to have been disposed of in the same way as animal bones, by being dumped in a single pit.

Such evidence suggests straightforward cannibalism was carried out in Gough's Cave. However, there are other factors to note, said Bello. "These were very difficult times and it is still quite possible people ate each other because there simply wasn't anything else to eat." The landscape – although rapidly recovering – would still have been pretty barren, particularly in winter.

In addition, Bello also pointed out that the remains of only a few individuals had been found at Gough's Cave. In other words, there is no evidence that large-scale human butchery had been practised there. "That means we cannot completely rule out the possibility that this was some form of ritual cannibalism, although I think it is unlikely," said Bello.

At present, most evidence indicates that humans were probably using the skills that they had acquired in butchering animal flesh, in particular the meat of horses as well as reindeer, another stone age favourite, in order to cut up humans who had died of natural causes.

"We don't see any traumatic wounds in these remains which would suggest violence was being inflicted on living people. This was some kind of cultural process that they brought with them from Europe," she said.

Whatever the nature of the cannibalism that was carried out by these early settlers, it did them little good in the end. Two thousand years after the ice age ended, Europe was plunged into a new, catastrophic freeze. A massive lake of glacial meltwater built up over northern America. Then it burst its banks and billions of gallons of icy water poured into the north Atlantic, deflecting the Gulf Stream. Temperatures in Britain plunged back to their ice age levels and the country was once again completely depopulated.

"This new period of intense cold lasted for more than a thousand years," said Stringer. "Only by 11,500 years ago did conditions start to return to their present level – and Britain was colonised by humans for the last time."

• This article was amended on 23 June 2010 to correct the spelling of Dr Silvia Bello's name.


Radiocarbon dating was developed in 1949 and used to give dates for ancient Egyptian sites. These fitted well with previous age estimates, earning its developer – Willard Libby, of Chicago University – a Nobel prize. Since then the technique has played a pivotal role in archaeology.

The technique has some unworldly roots, however, and exploits the fact that cosmic rays strike atoms of nitrogen in the upper atmosphere, transforming them into an isotope of carbon called carbon-14. This is radiocarbon and chemically identical to normal carbon. Both are absorbed into the bodies of living beings, a process that stops when an organism dies. Its store of carbon-14 begins to decay back into nitrogen. After about 5,700 years, half is left. Then after 11,400 years, a quarter is left, and after about 17,000 years an eighth remains. By measuring a piece of organic material's radiocarbon content, its age can then be calculated.

There are drawbacks, however. After 35,000 years, only 2% of a sample's radiocarbon will remain. Not only is this tricky to measure, it puts a sample at risk of contamination. If an impurity of only 1% of new carbon – for example, from plant material – is added, the sample would appear 4,000 years younger. This has bedevilled research into human origins in Europe and explains why radiocarbon dates for the Gough's Cave bones from the Cheddar Gorge have produced such variable results.
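Those figures can be sanity-checked with a short back-of-the-envelope script. This is only a sketch, assuming the commonly quoted 5,730-year half-life of carbon-14 and treating "modern carbon" as having its full, undecayed carbon-14 fraction:

```python
import math

HALF_LIFE = 5730.0  # commonly quoted half-life of carbon-14, in years

def fraction_remaining(age_years):
    """Fraction of the original carbon-14 left after a given true age."""
    return 0.5 ** (age_years / HALF_LIFE)

def apparent_age(fraction):
    """Radiocarbon age implied by a measured carbon-14 fraction."""
    return -HALF_LIFE * math.log2(fraction)

true_age = 35_000
f = fraction_remaining(true_age)  # roughly 1.5-2% of the carbon-14 survives

# Mix in 1% modern carbon, which still carries its full carbon-14 fraction of 1.0
f_contaminated = 0.99 * f + 0.01 * 1.0
shift = true_age - apparent_age(f_contaminated)

print(f"{f:.1%} of the radiocarbon remains")
print(f"1% modern contamination makes the sample look {shift:,.0f} years younger")
```

Under these assumptions the contaminated sample dates roughly 4,300 years too young, consistent with the 4,000-year figure quoted in the text.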

To get round this problem, Tom Higham uses the ultra-filtration technique. Long strands of the chemicals making up collagen in bone samples are isolated, while shorter sections – found in plants and other sources of contamination – are removed. Combined with new mathematical techniques, this technology is giving scientists new precision in pinpointing the ages of bones and skulls.

General Discussion / Protein Bars made with Cricket Flour
« on: March 17, 2014, 02:23:10 am »


The last time I visited Boston's Museum of Fine Arts was in 2004 to see a Rembrandt exhibition. But I might have wandered away from the works of the Dutch master in search of an ancient Greek artifact, had I known at the time that the object in question, a wine vessel, was in the museum's collection. According to the 2012 Christmas issue of the BMJ (preacronymically known as the British Medical Journal), the 2,500-year-old cup, created by one of the anonymous artisans who helped to shape Western culture, is adorned with the image of a man wiping his butt.

That revelation appears in an article entitled “Toilet Hygiene in the Classical Era,” by French anthropologist and forensic medicine researcher Philippe Charlier and his colleagues. Their report examines tidying techniques used way back—and the resultant medical issues. Such a study is in keeping with the BMJ's tradition of offbeat subject matter for its late December issue—as noted in this space five years ago: “Had the Puritans never left Britain for New England, they might later have fled the British Medical Journal to found the New England Journal of Medicine.”

The toilet hygiene piece reminds us that practices considered routine in one place or time may be unknown elsewhere or elsetime. The first known reference to toilet paper in the West does not appear until the 16th century, when satirist François Rabelais mentions that it doesn't work particularly well at its assigned task. Of course, the ready availability of paper of any kind is a relatively recent development. And so, the study's authors say, “anal cleaning can be carried out in various ways according to local customs and climate, including with water (using a bidet, for example), leaves, grass, stones, corn cobs, animal furs, sticks, snow, seashells, and, lastly, hands.” Sure, aesthetic sensibility insists on hands being the choice of last resort, but reason marks seashells as the choice to pull up the rear. “Squeezably soft” is the last thing to come to mind about, say, razor clams.

Charlier et al. cite no less an authority than philosopher Seneca to inform us that “during the Greco-Roman period, a sponge fixed to a stick (tersorium) was used to clean the buttocks after defecation; the sponge was then replaced in a bucket filled with salt water or vinegar water.” Talk about your low-flow toilets. The authors go on to note the use of rounded “fragments of ceramic known as ‘pessoi’ (meaning pebbles), a term also used to denote an ancient board game.” (The relieved man on the Museum of Fine Arts's wine cup is using a singular pessos for his finishing touches.) The ancient Greek game pessoi is not related to the ancient Asian game Go, despite how semantically satisfying it would be if one used stones from Go after one Went.

According to the BMJ piece, a Greek axiom about frugality cites the use of pessoi and their purpose: “Three stones are enough to wipe.” The modern equivalent is probably the purposefully self-contradictory “toilet paper doesn't grow on trees.”

Some pessoi may have originated as ostraca, pieces of broken ceramic on which the Greeks of old inscribed the names of enemies. The ostraca were used to vote for some pain-in-the-well-you-know to be thrown out of town—hence, “ostracized.” The creative employment of ostraca as pessoi allowed for “literally putting faecal matter on the name of hated individuals,” Charlier and company suggest. Ostraca have been found bearing the name of Socrates, which is not surprising considering they hemlocked him up and threw away the key. (Technically, he hemlocked himself, but we could spend hours in Socratic debate about who took ultimate responsibility.)

Putting shards of a hard substance, however polished, in one's delicate places has some obvious medical risks. “The abrasive characteristics of ceramic,” the authors write, “suggest that long term use of pessoi could have resulted in local irritation, skin or mucosal damage, or complications of external haemorrhoids.”

To quote Shakespeare, “There's a divinity that shapes our ends.” Sadly, for millennia the materials used to clean our divinely shaped ends were decidedly rough-hewn.

Suggestion Box / New Member Personal Message Limits
« on: March 08, 2014, 05:26:57 am »
Is there any way the mods can change this so that new members or members with low numbers of posts are able to receive more Personal Messages? 


Mothers may say they don't care whether they have a son or a daughter, but their breast milk says otherwise.

"Mothers are producing different biological recipes for sons and daughters," says Katie Hinde, an evolutionary biologist at Harvard University.

Studies in humans, monkeys and other mammals have found a variety of differences in both the content and the quantity of milk produced.

One common theme: baby boys often get milk that is richer in fat or protein — and thus energy — while baby girls often get more milk.

There are a lot of theories as to why this happens, says Hinde, who presented her findings at the American Association for the Advancement of Science's annual meeting.

Rhesus monkeys, for instance, tend to produce milk with more calcium for daughters, who inherit social status from their mothers.

"It could be adaptive in that it allows mothers to give more milk to daughters, which is going to accelerate their development and allow them to begin reproducing at early ages," says Hinde.

Reasons unclear

Males don't need to reach sexual maturity as quickly as females because the only limit on how often they reproduce is how many females they can win over.

The females also nurse for longer than the male monkeys, who spend more time playing off on their own and thus need more energetically dense milk.

It's not yet clear why human mothers produce such different milk for their babies, says Hinde.

There is evidence, however, that the stage is set while the baby is still in utero.

Hinde published a study last week that showed that the sex of the foetus influences the milk production of cows long after they are separated from their calves – typically within hours of the birth.

The study of 1.49 million cows found that, over the course of two 305-day lactation periods, they produced an average of 445 kilograms more milk when they had female calves than when they had males.

They also found no difference in the protein or fat content of the milk produced for heifers and for bulls.

Improving milk formula

Much remains to be understood about how breast milk impacts infant development in humans, says Hinde.

Knowing more could help improve the baby milk formulas sold to mothers who are unable or unwilling to nurse their infants, she says.

"While the food aspects of milk to some extent are replicated in formula, the immuno factors and medicine of milk are not and the hormonal signals are not," she says.

Getting a better understanding of how milk is personalised for specific infants will also help hospitals find better matches for breast milk donated to help nourish sick and premature infants in neonatal units, she adds.

General Discussion / soil, "grounding", nature.
« on: February 01, 2014, 09:15:10 pm »
"The Lakota was a true naturist – a lover of Nature. He loved the earth and all things of the earth, the attachment growing with age. The old people came literally to love the soil and they sat or reclined on the ground with a feeling of being close to a mothering power. It was good for the skin to touch the earth and the old people liked to remove their moccasins and walk with bare feet on the sacred earth. Their tipis were built upon the earth and their altars were made of earth. The birds that flew in the air came to rest upon the earth and it was the final abiding place of all things that lived and grew. The soil was soothing, strengthening, cleansing, and healing... That is why the old Indian still sits upon the earth instead of propping himself up and away from its life-giving forces. For him, to sit or lie upon the ground is to be able to think more deeply and to feel more keenly; he can see more clearly into the mysteries of life and come closer in kinship to other lives about him... Kinship with all creatures of the earth, sky and water was a real and active principle. For the animal and bird world there existed a brotherly feeling that kept the Lakota safe among them and so close did some of the Lakotas come to their feathered and furred friends that in true brotherhood they spoke a common tongue. The old Lakota was wise. He knew that man's heart away from nature becomes hard; he knew that lack of respect for growing, living things soon led to the lack of respect for humans too. So he kept his youth close to its softening influence." – Chief Luther Standing Bear

General Discussion / The lost tribe that is trying to save the world
« on: December 17, 2013, 12:25:27 am »
The lost tribe that is trying to save the world

More than 20 years ago, Alan Ereira made a film about an elusive Colombian people who changed how he saw the universe. A follow-up has an urgent warning for mankind

Off Topic / What are you currently reading, watching, learning?
« on: December 13, 2013, 01:29:03 am »
How to raise chickens on compost :)

How to make cooked dick soup. I wouldn't do it, but it's interesting to see the lengths (no pun intended) that people go to in order to eat dick.

Health / Vision Improvement
« on: November 29, 2013, 12:43:47 am »
I have worn eyeglasses since I was 12. I remember the first time looking out into my front yard and how precisely I could see the definition in the pansies my mom had planted along the driveway. It was a miracle to see long distance, so glasses were definitely something I needed. However, each year during my eye exams, the doctors would purposefully overcorrect my vision to 20/15 (20/20 being normal). Perhaps they thought I would be able to see into the future, but there was no real reason to do this. And each year when I went in for an exam, my vision had worsened and my prescription was raised. I remember that for the first few weeks after each change my vision was so sharp, but the overcorrection strained my eyes and was only a panacea, not an improvement.

Three years ago I finally realized I could just ask the doctor NOT to change my prescription. There are some legalities about how well you have to be able to see to drive and such, but I could stay well within that limit with a much weaker prescription. So for three years I have had the same prescription. My vision definitely wasn't 20/20, but it also wasn't horrible. It was fine: I could see and read most everything near and far, road signs, etc.

 I went to the eye doctor today to have an eye exam so that I could purchase new contact lenses and glasses.

He was very receptive to this and said he thinks I am correct in assuming overcorrection is what led to the weakness in my eyes. He told me his normal procedure is to bring the vision to 20/20 and then back it down to where the eyes aren't straining. He also told me that we would disregard my old prescription until after he was done with the exam.

So 20 minutes later, after staring at the chart, reading and rereading letters through different lenses, he tells me my left eye, an eye I'd injured when I was younger, has slightly better vision than my right, and he writes down my prescription. He then goes back out to the office to grab my old prescription and comes back with the news that my vision has actually improved, especially in my left eye!

This likely would not have happened if I had continued to have my prescription changed over those three years. I also attribute it to finally admitting that I really can't do carbs, that my blood sugar has an incredible effect on my vision, and that carbs just aren't that necessary. It's not much, and I still have horrible vision all around, but it gives me hope.

I really think that inundating my body with all the necessary building blocks in the best forms I can find (raw fats, proteins, vitamins and minerals) has a lot to do with my vision improving. I plan to see this eye doctor more regularly, if I can afford it, to catch any improvements and capitalize on the chance that I might be able to slowly back off my prescription. Of course I will also continue to eat such an amazing, regenerative diet, and that will help my vision continue to improve.

Off Topic / the Yeti..!
« on: October 20, 2013, 02:47:21 am »
Recent findings of an Oxford researcher suggest the yeti is real (but a descendant of an ancient polar bear)! Perhaps scientists can truly determine this by hitting up Bhutan:

I was looking for a source for a whole lamb and noticed that this ranch was offering nationwide shipping. I know some of you buy from out of state or in bulk, so I thought I would post this here in case anyone was interested in sourcing bison meat and/or if this turned out to be a good deal. I have no idea, as I am lucky enough to get fresh, local meats, but I have tried their bison and it's definitely good quality.

General Discussion / mainstream "raw primal" website
« on: September 27, 2013, 07:37:03 am »

I was reading something about a WAPF conference with Sally Fallon where they served raw pâté catered by the woman whose website is linked above. There is a lot of mention of raw milk, which I think is much easier for people to buy into since it's delicious and addictive, but the site does mention raw meat eating.
