
Tuesday, October 7, 2008

What If Saturated Fat Is Not the Problem?

An article in dLife, written by Richard Feinman, PhD:

Here’s an idea to chew on: The carbs in your diet tell your body what to do with the fat you eat, so it’s the type and amount of carbohydrates that matter when it comes to your weight and health.

Virtually every bit of health information today includes the advice to avoid saturated fat — the so-called evil stuff that lurks in animal foods like steak and eggs. The basis for this recommendation is that research has shown a correlation between saturated fat intake and total cholesterol and LDL (“bad cholesterol”). The problem with these studies is that the effects are not large, there is wide variation among individuals and, in most of these studies, the predicted benefit in incidence of cardiovascular disease did not materialize. In addition, we now know much more about risk factors for cardiovascular disease (CVD) beyond LDL. No assessment of CVD risk can be made without considering HDL (“good cholesterol”), triglycerides, and the size of the LDL particle. Plenty of research shows that these markers can worsen when people reduce their intake of saturated fat and that they can improve by reducing the intake of carbohydrates.

You don’t have to be a medical researcher to recognize that this is a politically charged issue. The thing that is missing for the public is an impartial evaluation of all the data on saturated fat. My personal opinion is that there is much contradictory data and a recent review of the situation suggests that there is not sufficient evidence to make any recommendations.

There is a sense that, in the absence of definitive evidence, lowering saturated fat will at least do no harm. This is not right. The problem for people with diabetes is what happens when saturated fat is replaced with carbohydrate, and research has repeatedly shown that this may actually be harmful. Consider that, according to the Centers for Disease Control and Prevention, during the onset of the current epidemic of obesity and diabetes, almost all of the increase in calories in the American diet has been due to carbohydrate. The percent of total fat and saturated fat in our diet decreased. In men, the absolute amount of saturated fat consumed decreased by 14 percent!

One of the most striking reasons to doubt the across-the-board proscriptions against saturated fat is the report from the large-scale Framingham study in the Journal of the American Medical Association, titled “Inverse association of dietary fat with development of ischemic stroke in men.” You read that right: The more saturated fat in the diet, the lower the incidence of stroke.

Perhaps the most compelling research was published in a 2004 issue of the American Journal of Clinical Nutrition by researchers from the Harvard School of Public Health. Their study showed that, in postmenopausal women with heart disease, a higher saturated fat intake was associated with less narrowing of the coronary artery and a reduced progression of disease. Even with similar levels of LDL cholesterol, women with lower saturated fat intake had much higher rates of disease progression. Higher saturated fat intake was also associated with higher HDL (the “good” cholesterol) and lower triglycerides.

If saturated fat isn’t the problem, what is?
In this study, in which greater saturated fat intake was associated with less progression of coronary atherosclerosis, carbohydrate intake was associated with a greater progression. Carbohydrate, through its effect on insulin, is the key player. Insulin not only sweeps up glucose from the blood but it also plays air traffic controller, making the call as to whether that glucose is turned into fat or is used for energy. Most importantly, insulin determines what happens to dietary fat — whether it gets stored or oxidized for fuel. In fact, the fat profile in the blood (cholesterol and triglycerides) is not strongly tied to diet.

A recent study by Jeff Volek at the University of Connecticut compared low-carbohydrate and low-fat diets. Even though the low-carbohydrate diet had three times as much saturated fat as the low-fat diet, levels of unhealthy fats in the blood were lower in the low-carbohydrate group. How is that possible? That is what metabolism does.

What is the best diet?
We don’t know the ideal diet composition. We do know that saturated fat, unlike trans-fat, is a normal part of body chemistry, and extreme avoidance is not justified by current scientific data. Removing some saturated fat to reduce calories is good, but adding back carbs appears to be deleterious. It appears that healthy carbohydrate restriction will trump the effects of any kind of fat. For a person with diabetes, blood glucose must be the first consideration. If you have relatively tight blood sugar control, the amount of saturated fat you eat may be a non-issue. You can do what we did before the diabetes-obesity epidemic: regulate your intake by your taste and your natural appetite. No one ever wanted to eat a pound of bacon.

Sources:
1. Food and Nutrition Board: Macronutrients. In: Dietary Reference Intakes. National Academies Press; 2005, p. 484.

2. JB German, CJ Dillard: Saturated fats: what dietary intake? Am J Clin Nutr 2004, 80:550-559.

3. MW Gillman, et al.: Inverse association of dietary fat with development of ischemic stroke in men. JAMA 1997, 278:2145-2150.

4. D Mozaffarian, EB Rimm, DM Herrington: Dietary fats, carbohydrate, and progression of coronary atherosclerosis in postmenopausal women. Am J Clin Nutr 2004, 80:1175-1184.

5. JS Volek, et al.: A hypocaloric, very low carbohydrate, ketogenic diet results in a greater reduction in the percent and absolute amount of plasma triglyceride saturated fatty acids compared to a low fat diet. NAASO, Boston, MA, October 2006.

ADA Revises Nutrition Recommendation for those with Diabetes

I'm still reading through the just-released Nutrition Recommendations and Interventions for Diabetes - the 2008 position statement from the American Diabetes Association regarding dietary recommendations for those at risk for or diagnosed with diabetes.

While I finish reading the actual documents and write up what I think of the paper, here are links to what others are saying today:

Jimmy Moore - New 2008 ADA Recommendations Partially Acknowledge Low-Carb Diets

Adam Campbell - Apparently Hell Just Froze Over

Dr. Mary Vernon - HAS THE AMERICAN DIABETES ASSOCIATION SPARKED YET ANOTHER ATKINS REVOLUTION?

Low-Carb Ketogenic Diet - Greater Weight Loss and More...

A neat little study was published in the January 2008 American Journal of Clinical Nutrition - Effects of a high-protein ketogenic diet on hunger, appetite, and weight loss in obese men feeding ad libitum.

In the study, researchers followed seventeen obese men, confined to a metabolic ward, for a month while they consumed either a low-carb ketogenic diet or a moderate carbohydrate diet on an ad libitum basis (eat as much as desired from foods allowed).

Over the course of the study, the protein intake of both dietary regimens was fixed to provide 30% of calories; carbohydrate was restricted to 4% of total calories in the men consuming the low-carb diet and 22% in the men consuming the moderate carbohydrate diet; and fat intake rounded out the calories in each diet, with no specific limitation on fat consumption in either.

All meals were prepared and provided as requested by the men, and they were allowed to eat whatever they wanted of the allowed foods in each meal, with no restrictions other than on their carbohydrate options. The men on the low-carb diet consumed fewer calories each day on their own and reported higher feelings of satiety while on the diet.

On average, the difference in carbohydrate intake was large - the men on the low-carb diet consumed just 22g of carbohydrate each day, while those on the moderate carbohydrate diet consumed 170g each day. Both levels of intake were significant reductions from baseline, where the men averaged 396g of carbohydrate each day.

Weight loss was greater for the men following the low-carb diet, who averaged a loss of 6.34kg (13.95-pounds) compared to an average loss of 4.35kg (9.57-pounds) on the moderate carb diet. Calorie differences between the two groups do not fully explain the greater weight loss in the men consuming the low-carb diet, since they ate about 1731-calories a day compared to about 1898-calories a day in the moderate carb group. This difference - about 167-calories a day - translates to a month-long difference of 5016-calories, or 1.43-pounds. Yet the difference between the two groups was 4.38-pounds greater weight loss in those on the low-carb diet.
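The back-of-the-envelope arithmetic above can be checked with a few lines of Python. This is a rough sketch only: the figure of roughly 3,500 calories per pound of body fat is a conventional approximation, not a number from the study, and I've taken "a month" as 30 days.

```python
# Rough sanity check of the calorie arithmetic quoted above.
# Assumption (not from the study): ~3,500 kcal corresponds to ~1 lb of fat.

KCAL_PER_POUND = 3500   # conventional rule-of-thumb estimate
LB_PER_KG = 2.2046

low_carb_kcal = 1731    # average daily intake, low-carb group
moderate_kcal = 1898    # average daily intake, moderate-carb group
days = 30               # "a month" in the study

daily_gap = moderate_kcal - low_carb_kcal              # 167 kcal/day
monthly_gap = daily_gap * days                         # ~5,000 kcal over the month
predicted_extra_loss_lb = monthly_gap / KCAL_PER_POUND

# Observed difference in average weight loss: 6.34 kg vs 4.35 kg
observed_gap_lb = (6.34 - 4.35) * LB_PER_KG

print(f"Calories alone predict ~{predicted_extra_loss_lb:.2f} lb extra loss")
print(f"Observed extra loss was ~{observed_gap_lb:.2f} lb")
```

Running it reproduces the point being made: the calorie gap alone accounts for only about 1.4 pounds of the roughly 4.4-pound difference in weight loss.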

Digging deeper into the published data we also find that the men on the low-carb diet experienced statistically significant improvements in blood glucose, insulin and HOMA-IR, as well as favorable improvements in their cholesterol levels with a reduction in total cholesterol and LDL, an increase in HDL and a significant reduction in triglycerides.

All of these favorable changes occurred while the men consumed a dietary fat intake similar to that at baseline. Where at baseline they consumed an average of 126g of total fat with 43.8g of saturated fat, their dietary intake while following low-carb didn't change much - they averaged 129g of total fat on the low-carb diet and 46.3g of saturated fat. This basically highlights that modifying one's diet to be low-carb does not mean one suddenly increases dietary fat consumption significantly - in this trial, dietary fat was pretty much the same compared to baseline.

So, with this study, we have one more to add to the pile that supports carbohydrate restriction for satiety, spontaneous (ad libitum) calorie reduction, weight loss, improvements to glucose, insulin and insulin resistance, along with favorable improvements (although not statistically different from the moderate carbohydrate diet) to lipids.

Low-GI Diet for 1-year Suggests Improvements for T2DM?

As I was browsing through the January issue of the American Journal of Clinical Nutrition this weekend, a perspective written by John Miles caught my attention. In his article, A role for the glycemic index in preventing or treating diabetes, he wrote, "Elsewhere in this issue of the Journal, Wolever et al (7) report the results of the Canadian Trial of Carbohydrates in Diabetes (CCD). Patients with well-controlled type 2 diabetes who were treated with diet alone were randomly assigned to receive either a high-GI diet, a low-GI diet, or a low-carbohydrate, high-monounsaturated fat diet for 1 y.

The study was carefully conducted and of longer duration than many earlier trials. The investigators found no weight loss and a small increase in glycated hemoglobin (HbA1c) in all 3 groups. This increase in HbA1c is what one would expect with no intervention (8). The fact that glucose concentrations 2 h after an oral glucose challenge were significantly lower in persons who had followed the low-GI diet for 1 y than in those who followed the other 2 diets for 1 y suggests improvement in either insulin sensitivity or insulin secretion (or improvements in both)."

Sounds like a compelling study, so intrigued, I clicked open the study mentioned, The Canadian Trial of Carbohydrates in Diabetes (CCD), a 1-y controlled trial of low-glycemic-index dietary carbohydrate in type 2 diabetes: no effect on glycated hemoglobin but reduction in C-reactive protein, and read the abstract.

In it, researchers concluded "In subjects with T2DM managed by diet alone with optimal glycemic control, long-term HbA1c was not affected by altering the GI or the amount of dietary carbohydrate. Differences in total:HDL cholesterol among diets had disappeared by 6 mo. However, because of sustained reductions in postprandial glucose and CRP, a low-GI diet may be preferred for the dietary management of T2DM."

Here I was even more intrigued - a trial comparing three different dietary approaches, one of them low-carb, for a year?

But I wondered, how was it that the low-carb diet didn't do as well as other studies would suggest it should have?

Wanting to know this, I opened the full-text to understand how it was possible that greater improvement was not found in the "low-carbohydrate" subjects and why HbA1c didn't remain stable or improve in the course of one year with either low-GI or low-carbohydrate diets. Previous study data published by others would suggest that HbA1c would at least remain stable with low-carb, no?

Well, it took no more than five minutes to see why things turned out as they did - the "low-carbohydrate" diet was not a low-carbohydrate diet after all. At baseline, the subjects assigned the low-carb diet ate an average of 210g of carbohydrate each day, and during the low-carb diet they consumed an average of 199g of carbohydrate each day.

Hello? In whose fantasy world is 199g of carbohydrate each day a low-carb diet?

Ah, but I digress...

While the researchers took pains to measure many risk factors, at the end of the year, the subjects in every group experienced progression of their diabetes risk factors - there simply was no improvement to laud in this trial, no matter how you twist the data.

What's absolutely disappointing in how the findings are presented is that the researchers homed in on two measures of improvement - CRP and post-prandial glucose - to the exclusion of significant declines in other measures that are critically important for those with type II diabetes.

Where do I even begin?

Weight remained fairly stable in all three groups, with only the low-GI group actually gaining some weight, despite no meaningful difference in calorie intake from baseline through one year on the low-GI diet.

Worse, though, is the lack of critical thought around the marked and significant increase in waist circumference in all three diet groups.

  • The high-GI group started with a waist circumference of 99.1cm - it increased over the year to 103.1cm (+1.6 inches) - this despite stable weight on the scale (84.4kg at baseline; 84.3kg after 1-year on the diet).
  • The low-GI group started with a waist circumference of 98.3cm - it increased over the course of the year to 104.9cm (+2.6 inches). They also experienced a weight gain, going from 81.1kg at baseline to 83.9kg at the end of the study (+6.2-pounds gained).
  • The supposed "low-carbohydrate" group - eating 199g carbohydrate on average - started with a waist circumference of 98.6cm - it increased over the study period to 103.1cm (+1.8-inches) - like the high-GI group, this increase was despite a stable scale weight...they started at 84.7kg and ended the year at 84.3kg.
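For anyone wanting to double-check the metric-to-imperial conversions in the list above, here is a quick sketch in Python. The waist and weight figures are those quoted from the paper; the conversion factors are the standard ones.

```python
# Quick check of the unit conversions in the list above.
# Waist (cm) and weight (kg) figures are (baseline, 1-year) pairs from the paper.

CM_PER_INCH = 2.54
LB_PER_KG = 2.2046

groups = {
    "high-GI":           {"waist": (99.1, 103.1), "weight": (84.4, 84.3)},
    "low-GI":            {"waist": (98.3, 104.9), "weight": (81.1, 83.9)},
    "'low-carb' (199g)": {"waist": (98.6, 103.1), "weight": (84.7, 84.3)},
}

for name, g in groups.items():
    waist_change_in = (g["waist"][1] - g["waist"][0]) / CM_PER_INCH
    weight_change_lb = (g["weight"][1] - g["weight"][0]) * LB_PER_KG
    print(f"{name}: waist {waist_change_in:+.1f} in, weight {weight_change_lb:+.1f} lb")
```

Running this reproduces the +1.6, +2.6 and +1.8 inch waist increases and the +6.2-pound gain in the low-GI group.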

If that wasn't bad enough - all three groups experienced increases in their HbA1c too.

Those consuming a high-GI diet saw a rise from 6.2 to 6.34; low-GI saw a rise from 6.2 to 6.34; and those on the supposed low-carb diet rose from 6.1 to 6.35.

The researchers reported that this rise over time was statistically significant - and I'd say clinically significant too!

Now with just these risk measures, you'd think there was enough to maybe, just maybe, inspire the researchers to challenge the efficacy of any of the above diets for those with type II diabetes. Maybe even say that perhaps the level of carbohydrate - despite quality or glycemic index or load improvements - matters; that simply modifying the type of carbohydrate in the diet does NOTHING for glycemic control and if carbohydrate is consumed habitually at levels seen in this study, perhaps someone with type II diabetes should expect a continued progression of their disease?

But, ya know what? They didn't even consider that. And the above problems were not all they reported either.

Let's see what else was reported in the data:

Total cholesterol didn't do much in any of the groups; LDL didn't change significantly in any group. HDL, however, declined in the low-GI group, but rose in the high-GI and supposed low-carb groups.

Triglycerides fell slightly in the high-GI and supposed low-carb groups, but rose in the low-GI group.

Two more markers of potential health risks found to be problematic in the low-GI diet over a year - and the researchers even noted it in the full-text - "With the low-GI diet, mean triacylglycerol was 12% higher, HDL was 4% lower, and the ratio of apoB to apoA was 4% higher than with the low-CHO diet."

But a 12% increase in triglycerides and a 4% drop in HDL didn't set off any alarm bells either.

Neither did the fact the low-GI group saw an increase in their fasting plasma glucose over the year, which was also noted and basically disregarded.

With regards to the main focus in the abstract, C-Reactive Protein, the researchers did find a significant difference between the low-GI diet and the high-GI diet, but also noted that between the low-GI diet and the supposed "low-carb" diet there was no significant difference.

Yet, they chose to focus on the low-GI diet as better for a type II diabetic, despite the fact it led to

  • weight gain
  • waist circumference increase
  • increase in HbA1c
  • increase in fasting plasma glucose
  • a marked rise in triglycerides
  • and a decline in HDL

The conclusion here speaks volumes when taken in context to the carbohydrate intake in each group, "In subjects with T2DM managed by diet alone with optimal glycemic control, long-term HbA1c was not affected by altering the GI or the amount of dietary carbohydrate."

Better stated: with no meaningful change in absolute carbohydrate consumption, even with improvement in the quality of carbohydrate, a type II diabetic is likely to experience a progressive decline in glycemic control, along with deterioration in other risk factors, over a year.

The data is published right in the full-text - the glycemic index as a means to reverse or prevent diabetes is no solution.

In this study, those who followed the low-GI diet had the worst overall outcome - they gained weight, increased waist circumference, saw triglycerides rise while HDL fell, and experienced a decline in glycemic control as evidenced by their increased HbA1c.

Yet you wouldn't know it from the abstract which focuses instead on "sustained reductions in postprandial glucose and CRP" and then concludes that "a low-GI diet may be preferred for the dietary management of T2DM."

And then back to the article from John Miles, who said this study "suggests improvements" in those who followed the low-GI diet for one year. Who's he kidding?

ADA Says Low-Carb Okay for Weight Loss, So What?

An article published today in Diabetes Health, ADA Now Supports Low-Carb Diets, reminded me that I have not yet posted my thoughts on the updated guidelines for Medical Nutrition Therapy (MNT) recently issued by the American Diabetes Association (ADA). This will probably be longer than usual, so bear with me!

On December 28, 2007, the ADA issued a press release to highlight the publication of and changes within their clinical practice recommendations, better known as the Standards of Care in Diabetes.

Each year the guidelines are updated and this year was no exception - as noted in the press release, "Until now, the ADA did not recommend low carbohydrate diets because of lack of sufficient scientific evidence supporting their safety and effectiveness. The 2008 Recommendations include a statement recognizing the increasing evidence that weight-loss plans that restrict carbohydrate or fat calorie intake are equally effective for reducing weight in the short term (up to one year). The "Standards of Medical Care in Diabetes--2008" document reviews the growing evidence for the effectiveness of either approach to weight loss. In addition, there is now evidence that the most important determinant of weight loss is not the composition of the diet, but whether the person can stick with it, and that some individuals are more likely to adhere to a low carbohydrate diet while others may find a low fat calorie-restricted diet easier to follow."

There are two main issues I'll look at in the above statement today:

1. that low-carbohydrate diets are as effective as low-fat calorie restricted diets for weight loss for up to one-year

2. that composition of diet is less important than whether a person can stick with the dietary approach for weight loss.

In order to fully understand exactly what the ADA is saying with regard to the first issue, we need to return to the August 2006 issue of Diabetes Care, where an updated consensus statement was published, Management of Hyperglycemia in Type 2 Diabetes: A Consensus Algorithm for the Initiation and Adjustment of Therapy: A consensus statement from the American Diabetes Association [ADA] and the European Association for the Study of Diabetes [EASD]

As I noted in my review of that consensus statement, "Rather than question the dietary recommendations, or explore emerging data supportive of dietary interventions that are different from the recommendations, the statement instead concludes that 'the limited long-term success of lifestyle programs to maintain glycemic goals in patients with type 2 diabetes suggests that a large majority of patients will require the addition of medications over the course of their diabetes.'"

The final sentence in the section discussing medications, which followed the section on lifestyle intervention, sets the stage for what is to come, "addition of medications is the rule, not the exception, if treatment goals are to be met over time."

In August 2006 the ADA, along with the EASD, threw up their hands and decided that dietary and lifestyle intervention was futile, therefore the only logical place to go was intensive pharmaceutical intervention at diagnosis.

The authors wrote, in the paper's conclusion, "We now understand that much of the morbidity associated with long-term complications can be substantially reduced with interventions that achieve glucose levels close to the nondiabetic range. Although new classes of medications, and numerous combinations, have been demonstrated to lower glycemia, current-day management has failed to achieve and maintain the glycemic levels most likely to provide optimal health care status for people with diabetes."

On this the ADA remains steadfast - pharmacological intervention is the first step with lifestyle intervention upon diagnosis. The freely available full-text of the Management of Hyperglycemia in Type 2 Diabetes: A Consensus Algorithm for the Initiation and Adjustment of Therapy clearly continues with the August 2006 consensus that lifestyle intervention for those diagnosed with diabetes will not work, therefore medication must be initiated upon diagnosis.

Yet we find the ADA falling all over itself to tout its position change for weight loss - that now a low-carbohydrate diet is considered as effective as a low-fat calorie restricted diet for weight loss? And somehow we're supposed to be jumping for joy that they made this change?

If we take the entire package of documents published in the Diabetes Care Supplement, we cannot reach any conclusion other than the ADA has made up its mind and is not going to review the evidence. They may concede that a low-carbohydrate diet can help with some loss of weight, but nothing else - and even that carries the caveat that one must be intensely monitored if they do decide to follow a low-carb diet.

But, going back to the first point - the concession that low-carbohydrate diets and low-fat calorie restricted diets are both effective for weight loss over the short-term.

Quite frankly this statement by the ADA is meaningless when we consider the full context of their position because they hold that "current-day management has failed to achieve and maintain the glycemic levels most likely to provide optimal health care status for people with diabetes."

While many are lauding the change as a step in the right direction for the ADA, I'm not impressed, nor convinced - if anything, the ADA only confirmed what it has already said previously.

We only need to go back to the publication of a 22-month study, in which diabetic subjects were found to have significant health improvements following a low-carbohydrate diet, to read the ADA reaction in an article at WebMD: "While agreeing that carbohydrate restriction helps people with type 2 diabetes control their blood sugar, ADA spokesman Nathaniel G. Clark, MD, tells WebMD that the ADA does not recommend very low-carb diets because patients find them too restrictive. 'We want to promote a diet that people can live with long-term,' says Clark, who is vice president of clinical affairs and youth strategies for the ADA. 'People who go on very low carbohydrate diets generally aren't able to stick with them for long periods of time.'"

Which brings us to issue two above - diet composition does not matter as much as a diet one can follow, a theme the ADA has been hot and heavy on for at least two years now.

Let's review the sentence in the ADA press release carefully, "In addition, there is now evidence that the most important determinant of weight loss is not the composition of the diet, but whether the person can stick with it, and that some individuals are more likely to adhere to a low carbohydrate diet while others may find a low fat calorie-restricted diet easier to follow."

Evidence? What evidence?

The Standards of Medical Care in Diabetes 2008 includes this sentence, "Although numerous studies have attempted to identify the optimal mix of macronutrients for meal plans of people with diabetes, it is unlikely that one such combination of macronutrients exists. The best mix of carbohydrate, protein, and fat appears to vary depending on individual circumstances. For those individuals seeking guidance on macronutrient distribution in healthy adults, the Dietary Reference Intakes (DRIs) may be helpful;" referencing the IOM documents published back in 2002.

The Nutrition Recommendations and Interventions for Diabetes: A Position Statement of the American Diabetes Association 2008 includes "Nutrition counseling should be sensitive to the personal needs, willingness to change, and ability to make changes of the individual with pre-diabetes or diabetes. (E)"

Note the letter "E" assigned to it, classing it as "expert opinion" - so again, where is the evidence? Each study referenced dates between 1997 and 2004 - so what exactly is the new evidence alluded to in the updated documents?

Oh, that's right, there is NONE...this is simply an opinion and has already been stated numerous times before.

I've said it before, "evidence versus sophistry; with just enough opinion thrown in to ensure glycemic control remains elusive..."

The ADA refuses to acknowledge that diabetics deserve clear statements about how to achieve normal blood sugars, and instead continues headlong on this path that they somehow deserve to eat like anyone else in the population and can mediate the effects of carbohydrate-rich food with medications.

:::sigh:::

So yeah, the ADA now says one can try a low-carbohydrate diet for weight loss, for the short-term (up to 1 year) and that some will somehow manage to follow such a diet. But let's not forget, if you do decide to follow a low-carbohydrate diet, you're also going to be subjected to much more intense monitoring than your low-fat calorie restricted peer and you're left with no advice other than the same-old same-old once your year is up.

Then what?

The failed ADA diet?

Lifelong medication with continued stepped-up pharmaceutical requirements with each passing year until you're dependent upon insulin injections?

The ADA, even with this new position that a low-carbohydrate diet may be used for up to one year for weight loss, still continues to fail in their mission - to prevent and cure diabetes and to improve the lives of all people affected by diabetes - because they refuse to actually review the hard data available; and instead continue in this sophistry that dietary recommendations need to be based upon what one wants to eat rather than what one should eat based upon metabolic, hormonal and physiological facts.

Animal products are ‘whole foods,' too

Dana Carpender's latest - I couldn't have said it better myself!

Animal products are 'whole foods,' too
Dana Carpender

The nutritional buzz phrase is 'whole foods.' This is encouraging. I've been watching the nutrition scene long enough to remember when people who insisted that whole-grain bread was more nutritious than enriched bread were scorned as 'food faddists.'

But the admonitions to eat whole foods seem to apply only to grains, fruits and vegetables. Officialdom still recommends discarding large fractions of animal foods. Yet few see these fractionated animal foods as the refined, depleted foods they are.

Take dairy. Virtually all recommendations for dairy products include the qualifiers 'low-fat' or 'fat-free.' But that's not the way it comes out of the cow. Yes, whole milk has more calories than skim. It also has far more vitamin A, because it's carried in the butterfat. (Some skim milk is fortified with vitamin A — the equivalent of adding a few vitamins back to nutritionally depleted white flour.) Because fat aids in calcium absorption, you'll get more calcium from whole milk. Whole milk from grass-fed cows supplies CLA, a fat that increases fat-burning and reduces heart disease and cancer risk, and omega-3 fats, which reduce inflammation, heart disease risk, and cancer risk. It is worth paying premium prices for such milk.

And eggs. Oh, poor eggs. There they are, just about the most perfect food in the world, and what do people do? They throw away the yolks. The part with almost all the vitamins, including A, E, K and the hard-to-come-by D, not to mention brain-enhancing choline and DHA. Eggs from pastured chickens also have yolks rich in omega-3. Better to throw away the whites, not that I'd recommend that, either. Just eat whole eggs, will you?

Then there's chicken. When did 'chicken' become synonymous with 'boneless, skinless chicken breast?' Chicken breast is a good food, but the whole chicken is better. Dark and white meats both have nutritional strengths. They are not identical in vitamin and mineral content. Chicken skin is a good source of vitamin A, again because it's fatty. I wrote recently about liver's nutritional bonanza, and hearts are nutrient-rich as well, making giblet gravy a great idea. Simmering the leftover chicken bones yields flavorsome broth rich in highly absorbable calcium and joint-building gelatin. (I save my steak bones, too, for beef broth.)

Our ancestors, ever mindful of where their next meal was coming from, relished every edible part of every animal they killed. Indeed, paleoanthropologists assert that hunter-gatherers ate the rich, fatty organ meats first, preferring them to muscle meats, and smashed bones to eat the marrow. As recently as a century ago, marrow was such a popular food that special spoons were made for scooping it out of bones. I love the stuff. I've been sucking the marrow out of lamb-chop bones since I was a tyke. A 1997 article in the journal Nature asserts that human brain capacity decreased at the dawn of agriculture 10,000 years ago, very likely because of a reduction in animal-fat consumption. Whole animal foods are part of our nutritional heritage.

My low-carbohydrate eating habits are often referred to as a 'fad.' Whatever. If it was good enough for my hunter-gatherer ancestors, it's good enough for me. Do you want to know what's really a fad? Removing the fat from milk and the yolks from eggs, and discarding three-quarters of the chicken, all organ meats and most bones. There's not a culture in the world where our narrow, refined, low-fat, flavorless versions of animal foods are part of the traditional diet.

Continue reading for the recipe included in the original article!

Two-Fold Reduction in Triglycerides! How? Low-Carb!

In a recent study - Metabolic Effects of Weight Loss on a Very-Low-Carbohydrate Diet Compared With an Isocaloric High-Carbohydrate Diet in Abdominally Obese Subjects, published in the Journal of the American College of Cardiology - researchers reported favorable results for obese adults randomly assigned to a low-carbohydrate diet for 24 weeks (six months).

As reported in heartWire, "After six months, isocaloric energy-restricted very-low-carbohydrate, high-fat and high-carbohydrate, low-fat diets produced similar weight loss and substantial reductions in a number of cardiovascular disease risk markers," write Jeannie Tay (Flinders University, Adelaide, Australia) and colleagues in the January 1, 2008 issue of the Journal of the American College of Cardiology.

"Neither diet displayed adverse effects, suggesting diverse dietary patterns, including very-low-carbohydrate, high-fat diets, may be tailored to an individual's metabolic profile and dietary preference for weight management."

The investigators note that while the traditional diet reduced LDL-cholesterol levels, the low-carbohydrate, high-fat diet resulted in greater increases in HDL cholesterol and significantly larger reductions in triacylglycerol levels (a two-fold greater reduction compared to the traditional low-fat diet).

In fact, if we go to the full text, we find the researchers went so far as to write: "consistent with other recent studies, the VLCHF (very-low-carb, high-fat) diet produced greater reductions in TAG and increases in HDL-C than the HCLF (high-carbohydrate, low-fat) diet. This suggests that the VLCHF diet as a weight loss strategy may confer the greatest clinical benefits in patients who present with hypertriglyceridemia, low HDL levels, abdominal adiposity, and insulin resistance."

Did you catch that "greatest clinical benefits" part?

In the paper we learn that subjects were randomly assigned to either of the moderately energy-restricted diet plans for 24 weeks. For those assigned to the very-low-carbohydrate, high-fat diet, 4% of total calories were obtained from carbohydrates, 35% from protein, and 61% from fat, including 20% of total calories from saturated fat. Subjects randomized to the high-carbohydrate, low-fat diet followed a more traditional macronutrient profile, with 46% of calories obtained from carbohydrates and 30% from fat, including <8% from saturated fat.

Now we all know the American Heart Association insists we must keep saturated fat at less than 7% of our calories because intakes higher than that will kill us (eye roll) - yet here we have stunning improvements with saturated fat intake at/above 20% daily for six months! If you haven't read it yet, Dr. Richard Feinman wrote a good article about saturated fat recently.

So, what gives? It seems the subjects in this study significantly reduced their carbohydrate intake - and carbohydrate does matter. This study also confirms previously published data that found similar improvements in those restricting carbohydrate.

Slowly but surely, more data is coming forward that validates carbohydrate restriction not only for weight loss, but for health improvements!

Insulin Resistance and Cardiomyopathy

An interesting abstract was published in the recent issue of the Journal of the American College of Cardiology - Insulin-Resistant Cardiomyopathy, Clinical Evidence, Mechanisms, and Treatment Options.

Increasing evidence points to insulin resistance as a primary etiologic factor in the development of nonischemic heart failure (HF). The myocardium normally responds to injury by altering substrate metabolism to increase energy efficiency. Insulin resistance prevents this adaptive response and can lead to further injury by contributing to lipotoxicity, sympathetic up-regulation, inflammation, oxidative stress, and fibrosis.

Animal models have repeatedly demonstrated the existence of an insulin-resistant cardiomyopathy, one that is characterized by inefficient energy metabolism and is reversible by improving energy use. Clinical studies in humans strongly support the link between insulin resistance and nonischemic HF.

Insulin resistance is highly prevalent in the nonischemic HF population, predates the development of HF, independently defines a worse prognosis, and predicts response to antiadrenergic therapy.

Potential options for treatment include metabolic-modulating agents and antidiabetic drugs. This article reviews the basic science evidence, animal experiments, and human clinical data supporting the existence of an "insulin-resistant cardiomyopathy" and proposes specific potential therapeutic approaches.

-------------------------

Protein Provides Satiety Through PYY

In our strange world, we have researchers now promoting the idea that a pharmaceutical version of the gut hormone PYY may offer a solution to help individuals lose weight.

In the MSN article, Natural Gut Hormones May Provide a Treatment for Obesity, we learn that researchers are seeking to develop a pill to provide the satiety hormone PYY.

"The advantage of developing weight loss medications based on gut-derived satiety hormones is that they enhance a process that occurs naturally. It is expected, therefore, that side effects will be minimal," says Dr Sainsbury-Salis.

Folks, we're not PYY deficient; in fact, I'd argue we're not eating the foods that stimulate PYY to effectively sate appetite naturally.

As I noted in a previous blog post about research investigating PYY, "A high protein diet led to spontaneous calorie reduction as PYY increased. The phenomenon was consistent with both the animal model using mice and in human studies used to validate the mice model. Over a longer term, the higher protein diet stimulated weight loss and enhanced PYY synthesis and secretion in mice."

As I noted in that post, the study I wrote about included quite specific detail about how diet influences the release of PYY in humans - "The ready availability of carbohydrate-rich grains and cereals has been a recent development in human nutrition with the onset of organized agriculture. Many of the physiological systems that regulate food intake were probably established and may function better under lower-carbohydrate and higher-protein dietary conditions."

Those were not my words, but the words of the researchers!

And now we have researchers looking to design a pill to provide what we already have naturally - if we eat adequate protein and fat. But let's not go there and discuss diet; let's just pop a pill and continue along with the supposed "healthy diet" that obviously is not sating our appetite!

The future of Oracle APEX - aka Oracle APEX 4.0

In case you haven't read it on Marc Sewtz's blog or in the newest edition of the German Oracle APEX community newsletter: on Friday, July 25th at 14:00 (German time) there will be a WebCast about the features the Oracle APEX team is currently working on for Oracle APEX 4.0! The WebCast will also cover some tips & tricks for APEX 3.1.1.

Get the details about how to join the WebCast at the German Oracle APEX Community web site.

Overlapping labels in an Oracle APEX pie chart

Ever had the problem of overlapping labels in a pie chart used in an Oracle Application Express (APEX) application? Have a look at Gary Myers' excellent tip to avoid collisions by using analytic functions to sort the data. It's another great example of using the built-in power of the Oracle database.

Two new Oracle APEX whitepapers

OK, they are not brand new anymore; they were already released last month. But in case you are not reading David Peake's blog or regularly checking the Oracle APEX website on OTN, the APEX team has published the following two new whitepapers:

  • NTLM Authentication (a PL/SQL only solution) and
  • Oracle APEX with RAC (Real Application Cluster)

Oracle APEX team needs your help!

It's your chance to help the Oracle APEX team spread the word about Oracle APEX. They want to convince the Oracle eBusiness Suite/Oracle Applications team to formally legitimize the use of Oracle Application Express with the Oracle eBusiness Suite/Applications. I think it would be another huge step in getting Oracle APEX into more companies if it could officially be used to write custom applications in that environment.

But they need your help! Check out David Peake's blog posting for more details.

What's next on the Oracle APEX roadmap? Oracle APEX 3.2!

Just read on David Peake's blog that he has updated the Statement of Direction for Oracle Application Express (APEX).

What has changed?

They are planning to release Oracle APEX 3.2, which will be Oracle APEX 3.1 + the Oracle Forms Migration Tool. If you are a Forms guru, they are looking for beta testers! Get the details on David's blog.

I think the Forms Migration Tool will be another major step for Oracle APEX in getting Forms developers into the APEX boat. The two tools have a lot of similarities (declarative, PL/SQL, ...) and the same kind of productivity for developing database-driven applications. Forms developers can re-use their existing skills, and companies protect their investment by using the same business logic (if written in database packages) as their Forms applications. Compared to learning or migrating an application into a new language (e.g. Java/.NET), this can be a huge time saver.

Oracle APEX 3.1.2 is out!

The Oracle APEX team has released a new patchset for Oracle Application Express (APEX). According to the readme file the new version 3.1.2 contains 28 new bug fixes (Note: the list also contains the bugs from the previous patchset).

The full release can be downloaded from the OTN download site.
The patchset with just the changed files can be found on Metalink with the patch number 7313609.

Number of displayed Records of Oracle APEX Popup LOV

Recently I answered a question about where to change the number of displayed records of a Popup LOV in Oracle APEX and I thought it could be of general interest.

The behavior of Popup LOVs can be changed through the Popup LOV template. You can find it at
Shared Components\Templates\Popup LOV.

In the Pagination section you will find the property Display which defines how many records are displayed at once.

The Popup LOV template also contains other interesting settings, like the icon used for the Popup LOV field, the text for the search button, the width and height of the Popup LOV window, and much more. Just have a look!

DOAG regional meeting in Munich on Oracle APEX

Just a quick announcement about an Oracle APEX meeting of the German Oracle Usergroup (DOAG) in Munich.

Date: Wednesday, 17.09.2008, 17:00 - 19:00

Planned talks:

1. Professional APEX development - project examples and lessons learned
Speaker: Dietmar Aust, OPAL Consulting

2. Possibilities with Oracle TEXT (full-text search) and APEX
Speaker: Carsten Czarski, ORACLE Deutschland GmbH

From around 19:00 the exchange of views will continue at the Löwenbräukeller at Stiglmeierplatz.

Further details and registration on the DOAG web site.

By the way, attendance is free.

Pro Oracle Application Express finally shipping?

Is John Scott's long-awaited book Pro Oracle Application Express finally shipping?

It looks like it!

There are reports on the OTN APEX forum that people have received mail from Amazon saying the book has shipped. I also checked the Amazon order page, and it shows that the book is in stock.

You will not guess where I am! ;-)

On the other hand, it's not very hard to guess. I'm where most of the Oracle blogging and Oracle APEX community currently is: at Oracle Open World in San Francisco.

I'm not sure how often I will blog, because I have a tight time schedule. But if you want to get hold of me, I will try to visit most of the Oracle APEX sessions.

On Tuesday, between 14:30 and 15:30, I'm giving my own presentation, titled The Power of the Oracle APEX Repository (Session Id: 300210). It's currently fully booked, but you might still get a chance to get in.

Enough for now, I have to listen to Tom Kyte's "Efficient Schema Design" now.

Tuesday, August 12, 2008

Bracketing frenzy ....oracle

I just found myself deciphering this:

SELECT ...
FROM report_results rpt
WHERE NOT ( (substr(rpt.report_type,2,1) in ('1','2','3','4','5')) and
(rpt.sig_type = 'ISDNMA') and
((rpt.group_ctn != rpt.ctn_prefix||rpt.ctn_suffix) and
(length(rpt.group_ctn) != length(rpt.ctn_prefix||rpt.ctn_suffix)) and
((length(rpt.group_ctn) != (length(rpt.ctn_prefix||rpt.ctn_suffix)-1)
))));

Which turns out to mean this:

SELECT ...
FROM report_results rpt
WHERE NOT ( SUBSTR(rpt.report_type,2,1) IN ('1','2','3','4','5')
AND rpt.sig_type = 'ISDNMA'
AND rpt.group_ctn != rpt.ctn_prefix||rpt.ctn_suffix
AND LENGTH(rpt.group_ctn) != LENGTH(rpt.ctn_prefix||rpt.ctn_suffix)
AND LENGTH(rpt.group_ctn) != LENGTH(rpt.ctn_prefix||rpt.ctn_suffix)-1 );

No wonder there are no brackets left in the shops.

EAV nightmare ....oracle

My charitable Christmas mood only goes so far. I'm looking through a spec which reads like a "database design nightmare!" themed advent calendar. Each page reveals a potential disaster more frightening than the one before. This is my favourite New Year hangover-inducing cocktail: entity-attribute-values and generic application design, all wrapped up in a gloriously mal-specified mess.

Table: Parameters

id          VARCHAR2(50) PRIMARY KEY -- The application requesting the value
identifier  NUMBER       PRIMARY KEY -- The name of the parameter
type        CHAR(1)                  -- 'I', 'S' or 'B': the type of value for the parameter
string_val  VARCHAR2(50)             -- 'Y' or 'N' or NULL if BOOLEAN, or the string
integer_val NUMBER                   -- The integer value

note: The id holds a value comprising the IP address, type of
application and the instance at that IP.
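To see what this design costs, here is a hedged sketch of reading a single typed parameter back out (table and column names are from the spec above; the id and identifier values are hypothetical):

```sql
-- Every read must dispatch on the type flag and pick the right value
-- column; booleans hide in string_val as 'Y'/'N'.
SELECT CASE type
          WHEN 'I' THEN TO_CHAR(integer_val)
          WHEN 'S' THEN string_val
          WHEN 'B' THEN string_val   -- 'Y'/'N' posing as a boolean
       END AS param_value
FROM   parameters
WHERE  id         = '10.1.2.3|GUI|1' -- IP, app type and instance mashed into one key
AND    identifier = 42;              -- a "name" that is a NUMBER
```

And that's before anyone asks about type safety, foreign keys, or a sensible query plan.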

Who needs modular code? ....oracle

Wouldn't it be nice if, when people wrote some useful code, they tried to make it suitably modular and reusable? This is what I have to contend with at the moment. We have a large, complex system written in Oracle Forms that we are now partially re-writing in HTMLDB (hurrah!). One function I want to replicate is the ability to change your own Oracle password; the current Forms application has a form to do this that looks like this:

Old Password: [ ] New Password: [ ] Confirm New Password: [ ]
So that should be a moment's work to redo, right? Wrong. There is a package of user-maintenance procedures that contains the following two relevant procedures:
PROCEDURE change_password_validate
   (p_username         IN VARCHAR2,
    p_old_password     IN VARCHAR2,
    p_new_password     IN VARCHAR2,
    p_confirm_password IN VARCHAR2,
    p_profile_name     IN VARCHAR2,
    p_mode             IN VARCHAR2 DEFAULT 'N',
    p_mask             IN VARCHAR2);

PROCEDURE change_password_process
   (p_username         IN VARCHAR2,
    p_new_password     IN VARCHAR2,
    p_confirm_password IN VARCHAR2,
    p_mask             IN VARCHAR2,
    p_profile_name     IN VARCHAR2,
    p_encrypt_pw       IN VARCHAR2,
    p_admin_mode       IN VARCHAR2 DEFAULT NULL);
Some design flaws are immediately evident:
  • Validation is totally separate from processing. If I choose to, I can skip the validate routine altogether and call the process routine to change the password to anything I like, regardless of whether I get the old password right or confirm it correctly. (Actually, this foolish separation of validation from processing is a company standard!)
  • I get to choose whether the password is to be stored (in our own application’s USERS table) in encrypted form or not. HTF do I know whether it should be encrypted or not?
  • I need to supply something called p_mask, which I think may be something to do with the encryption process, or maybe the validation process – none of this is documented, of course, or at least nobody knows where any such documentation may be found. I have tried passing the word 'mask' and it seems to work, except that all subsequent attempts to change the password then fail on the validation of the "old" password – perhaps because it has been encrypted in an unexpected manner.
  • I also need to supply something called p_profile_name, which I do happen to know is a user attribute something like a role, stored in the USERS table. Well excuse me, but if I’m passing in the username as a parameter, why should I have to go look up the USERS record and obtain the profile_name value just to pass it into this lazy procedure?
  • I don’t fully understand p_mode and p_admin_mode either, but at least they have defaults which I assume (for now) I can live with.
So instead of being a five-minute job, this is probably going to occupy about a day of my time: locating the source code (the packages are wrapped in the database), studying it to see what it is doing and, if all that fails, trying to find someone from the team that wrote the code and asking them what I should be doing.

Bring out your WTFs ....oracle

Always on the lookout for blog material requiring minimum editorial effort, we welcome your WTFs. Amusing and instructive examples of mind-boggling Oracle-related madness (ideally short ones) can now be sent to us at our new e-mail address: OracleWTF@bigfoot.com. I would also like to welcome our two new WTF contributors, Tony Andrews and Scott Swank.

Grow Your Own Concurrency Problem ....oracle

What's that? The sound of ORA-00001 approaching...

...
FUNCTION key_not_in_table (pkey IN INT) RETURN BOOLEAN
IS
   countkey INT;
BEGIN
   SELECT COUNT(key)
   INTO   countkey
   FROM   key_values
   WHERE  key = pkey;

   IF countkey > 0 THEN
      RETURN FALSE;
   END IF;
   RETURN TRUE;
END key_not_in_table;

PROCEDURE insert_or_update (pkey IN INT,
                            pval IN INT)
IS
BEGIN
   IF key_not_in_table(pkey) THEN
      INSERT INTO key_values
      VALUES (pkey, pval, 0);
   ELSE
      UPDATE key_values
      SET    value = pval
      WHERE  key = pkey;
   END IF;
END insert_or_update;
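The race lives in the gap between the SELECT COUNT and the INSERT: two sessions can both see zero and both insert, and the loser gets the ORA-00001 (or, without a unique constraint, you quietly collect duplicates). A single MERGE closes that window. A sketch against the same key_values table; the third column's name isn't shown above, so "flag" is a stand-in:

```sql
-- One atomic statement instead of check-then-act.
MERGE INTO key_values kv
USING (SELECT :pkey AS key, :pval AS value FROM dual) src
ON (kv.key = src.key)
WHEN MATCHED THEN
   UPDATE SET kv.value = src.value
WHEN NOT MATCHED THEN
   INSERT (key, value, flag)   -- "flag" is hypothetical: the 0 column above
   VALUES (src.key, src.value, 0);
```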

Universal SQL Performance Improver Discovered ....oracle

In an AskTom thread this week, the poster wrote:

"...I have been told before by several people, and I have implemented myself on several SQLs that adding the clause "AND 1=1" literally to any SQL statement helps improve the performance of the SQL statement dramatically."

And we've all been wasting our time looking for a FAST=TRUE parameter.


Friday, August 8, 2008

It's One More, Innit? ....oracle

Thanks to Scott Lynch for submitting an example of how a J2EE application developer just might not trust the database to do its job.

Over to Scott...

From a big bucks retail management system (now owned by a big bucks DBMS vendor).

1. Get NextVal from the sequence.

2. Assign the value, an integer, to a string.

3. Check to see if the string they just created exists.

4. Cast the integer that has been cast to a string, to a BigDecimal.

5. Add 1 to it (because they're obviously smarter than some silly old sequence).

I just love step 3.

Sheer brilliance on that one. And it's repeated for almost every table in this particular little slice of the application.

------------------------------------------------------------------------------------------------

public long getNextId() throws java.sql.SQLException {
    if (conn == null) {
        throw new java.sql.SQLException("Connection not set");
    }
    long nextIdLong = 0;
    try {
        // Create a statement
        tStmt = conn.createStatement();

        // Create a query string to get all the fields from the table. The
        // presentation layer will decide which field to display
        String query = "SELECT some_seq.nextval FROM dual";

        // The complete query is executed
        rs = tStmt.executeQuery(query);
        rs.next();
        String nextIdString = rs.getString(1);

        if (nextIdString != null) {
            nextIdLong = ((new BigDecimal(nextIdString)).add(new BigDecimal(1))).longValue();
        }

        tStmt.close();

    } catch (SQLException e) {
        throw new java.sql.SQLException(e.toString());
    }
    return nextIdLong;
}

Two days before the day after tomorrow ....oracle

Clearly a South Park fan worked here once:

cat oracle_GetThisworkingDay

#!/bin/ksh
# oracle_GetThisworkingDay
# Script to retrieve the current working day (YYYYMMDD) from
# the working_calendar table in the Oracle database.
DATE=`date +%Y%m%d%H%M`
CUTOFF=$2

oracle_GetPreviousWorkingDay `oracle_GetNextWorkingDay $DATE $CUTOFF`

I'll spare you the contents of these scripts. Suffice it to say they call the following functions:
   FUNCTION previous_day (
      p_date DATE DEFAULT SYSDATE
   )
      RETURN VARCHAR2
   IS
      v_result VARCHAR2(10);
   BEGIN
      SELECT dt
      INTO   v_result
      FROM   working_calendar
      WHERE  dt = (SELECT MAX(dt)
                   FROM   working_calendar
                   WHERE  dt < p_date);
      RETURN TO_CHAR(v_result, 'YYYYMMDD');
   END;
( nice use of SQL there ) and of course...
   FUNCTION next_day (
      p_now DATE DEFAULT CURRENT_DATE
   )
      RETURN DATE
   IS
      RESULT DATE;
   BEGIN
      SELECT MIN(dt)
      INTO   RESULT
      FROM   working_calendar
      WHERE  dt >= p_now + 1;

      RETURN RESULT;
   END;
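For the record, the whole shell-to-PL/SQL round trip (next working day after now, then the working day before that) collapses into one query. A sketch against the same working_calendar table:

```sql
-- "Current working day": the latest calendar date strictly before the
-- first working day on or after tomorrow. No shell scripts required.
SELECT MAX(dt)
FROM   working_calendar
WHERE  dt < (SELECT MIN(dt)
             FROM   working_calendar
             WHERE  dt >= SYSDATE + 1);
```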

When is a BLOB not a blob? ....oracle

When it's a Bee-Lob, apparently. If you think you know how to pronounce some of the more common Oracle-related words, you have to check Eddie Awad's post, "Char or Car", and the follow-up comments...

awads.net/wp/2006/01/18/char-or-car

INTEGER Type is Platform-Independent shock ....oracle

We are grateful to oracleplsqlprogramming.com for their December 2005 Tip of the Month: Insights into PL/SQL Integers, in which we learn this:

INTEGER - defined in the STANDARD package as a subtype of NUMBER, this datatype is implemented in a completely platform-independent fashion, which means that anything you do with NUMBER or INTEGER variables should work the same regardless of the hardware on which the database is installed.

And thank goodness for that, is what we say. Sometimes you just don't need platform-dependent results from your PL/SQL integer calculation depending on the hardware on which the database is installed.

I Object, Your Honour... ....oracle

Erm, excuse me for interrupting, but what exactly is this?

There are 2 ways to construct an ANYDATA. The CONVERT* calls enable construction of the ANYDATA in its entirety with a single call. They serve as explicit CAST functions from any type in the Oracle ORDBMS to ANYDATA.
(Found here in the 10.1 documentation).

Wednesday, August 6, 2008

Counting Sheep ....oracle

I'd not seen this Oracle Forums thread before, though it started in 2002 and now has 198 replies. Somebody once asked for some PL/SQL coding standards, someone else offered to email some, and then for ever after gets bombarded with requests from other people saying "Please send same to me at another-idiot-sheep@nobrain.com" Every now and then someone kindly posts a URL to some PL/SQL standards on the web, or points out that these people are just getting their email addresses onto a lot of spam mailing lists, but on and on they go asking for a copy to be sent direct to them. It's surprisingly funny.

EAV Returns: The Concrete Elephant approach ....oracle

Anyone who has read Tales Of The Oak Table, not to mention Tony Andrews' blog or any of the countless articles and discussions on the subject on AskTom and elsewhere, will know two things about the fabled "Entity-Attribute-Value" approach to database design, in which you model all "things" in one table with a "thing ID" and a "thing type", plus a second table holding one row per "attribute", and thus create an application that can model any conceivable type of thing, ever:

  1. It seems like a clever idea at first.
  2. It isn't.

But wait. A poster on OTN forums ("SIMPLE Database Design Problem") has solved the major problems inherent in the original Entity-Attribute-Value approach, by simply denormalising away the Attribute-Value part.

Now the ENTITIES table will have all the columns for every entity type. Maybe a lot of them will be null because "INVOICE" rows will use mainly different columns to "TROPICAL_DISEASE" rows, but disk space is cheap, and look at the simplification we have achieved by not having to babysit all those old-fashioned tables. And it's not a generic design any more, is it? It's concrete.

The table would look something like this:

ENTITYID ENTITYTYPE NAME      PRICE DIET  COLOUR ANNUAL_TURNOVER
-------- ---------- --------- ----- ----- ------ ---------------
       1 PERSON     William
       2 FRUIT      Banana                Yellow
       3 COMPANY    Megacorp                           100000000
       4 ANIMAL     Fruitbat        Fruit
       5 SNACK      Snickers    0.4

accompanied by a generic RELATIONS table like this:

ENTITY1 ENTITY2 RELATIONSHIP
------- ------- ------------
      3       1 EMPLOYS
      1       2 EATS
      1       5 EATS

Want to list the snacks eaten by Megacorp employees? Simple:

SELECT emp.entityid, emp.name, snack.name, snack.price
FROM entities emp
JOIN relations emprel
ON emprel.entity2 = emp.entityid
AND emprel.relationship = 'EMPLOYS'

JOIN entities com
ON com.entityid = emprel.entity1
AND com.entitytype = 'COMPANY'

JOIN relations snrel
ON snrel.entity1 = emp.entityid
AND snrel.relationship = 'EATS'

JOIN entities snack
ON snack.entityid = snrel.entity2
AND snack.entitytype = 'SNACK'

WHERE emp.entitytype = 'PERSON'
AND com.name = 'Megacorp';

Want to make FRUITBAT an employee of SNICKERS?

INSERT INTO relations VALUES (5, 4, 'EMPLOYS');

The thread becomes increasingly surreal as more and more posters suggest likely issues, from performance (he's prototyped it and the slowdown is insignificant) to complexity (the code will be generated dynamically from an object library) and the limited number of columns per table in Oracle (he might go with MySQL) while Erdem remains cheerfully confident that it will work (it won't).

My thanks to 3360 for sharing this. Send your WTFs to us at OracleWTF@bigfoot.com.

The Phantom's Gonna Git Ya ....oracle

I know I'm asking for trouble here by offering an AskTom page for a WTF, but I couldn't resist. If there was ever a time you wanted your spelling to be spot on, it would be when posting a link to a spell-checker...

Wanna Date? ....oracle

Dates are known to be exceedingly difficult and avoiding them at all costs is something of a skill. This function, in the spirit of Never do in SQL what you can do in PL/SQL, calculates a date range before calling another procedure that also avoided using dates for its input date parameters. So despite its absence, it is at least partially responsible for this mess.

I stripped the code down to its date handling which studiously avoids using date calculations wherever possible, and uses string handling instead, leaving in the comments because they are also the documentation.

create or replace function start_date (
   p_range   in varchar2,
   p_in_date in varchar2 -- DD-MON-YYYY format String
)
   return varchar2
as
   l_out_date   varchar2(11);
   month        varchar2(10) := to_char(to_date(p_in_date,'DD-MON-YYYY'),'MON');
   year         varchar2(10) := to_char(to_date(p_in_date,'DD-MON-YYYY'),'YYYY');
   v_cnt_yr     number;
   v_end_date   date := to_date(p_in_date,'DD-MON-YYYY');
   v_start_date varchar2(11);
begin
   if p_range = 'QTD' then

      if month in ('JAN','FEB','MAR') then
         -- if given month = march and date is 31
         -- then data for jan, feb and march.
         l_out_date := '01-JAN-'||year;
      elsif month in ('APR','MAY','JUN') then
         l_out_date := '01-APR-'||year;
      elsif month in ('JUL','AUG','SEP') then
         l_out_date := '01-JUL-'||year;
      elsif month in ('OCT','NOV','DEC') then
         l_out_date := '01-OCT-'||year;
      end if;

   elsif p_range = 'YTD' then

      -- beginning of the year.
      l_out_date := '01-JAN-'||year;

   elsif p_range = 'M' or p_range = 'MTD' then

      -- beginning of month
      l_out_date := '01-'||month||'-'||year;

   elsif p_range like 'B%' then
      v_cnt_yr := substr(ltrim(rtrim(p_range)), 2);
      -- We take the start date as the first day after trailing
      -- back the required no. of months
      l_out_date := to_char(last_day(add_months(
                       last_day(v_end_date), -v_cnt_yr)) + 1,
                       'DD-MON-YYYY');
   end if;
   return l_out_date;
end;

After reading it I thought, "So what does this do that TRUNC doesn't?" Apparently not a lot when you need to get the month, quarter or year to date. If you try this at home remember to format the return of START_DATE for readability since it usefully returns a 4000 character string.

SQL> exec :d := '17-JUN-2006'

PL/SQL procedure successfully completed.

SQL> select start_date('QTD',:d) start_date,
2 trunc(to_date(:d),'Q') from dual;

START_DATE TRUNC(TO_DA
----------- -----------
01-APR-2006 01-APR-2006

SQL> select start_date('YTD',:d) start_date,
2 trunc(to_date(:d),'Y') from dual;

START_DATE TRUNC(TO_DA
----------- -----------
01-JAN-2006 01-JAN-2006

SQL> select start_date('MTD',:d) start_date,
2 trunc(to_date(:d),'MM') from dual;

START_DATE TRUNC(TO_DA
----------- -----------
01-JUN-2006 01-JUN-2006

But what about the mysterious 'B%' format mask? This calculates the first day of the month, where 'Bn' is n-1 months ago, tricky eh? In SQL we are forced to call two functions instead of having START_DATE call ADD_MONTHS for us with two bonus LAST_DAYS thrown in for good measure. The n-1 bit could even be a bug but who knows?

SQL> select start_date('B12',:d) start_date,
2 trunc(add_months(to_date(:d),-11),'MM') from dual;

START_DATE TRUNC(ADD_M
----------- -----------
01-JUL-2005 01-JUL-2005

The convenience obviously outweighs the problem of having to deal with an undocumented date function that accepts and returns strings. Also with this function I have the luxury of substituting 'M' for 'MTD', but not 'Y' for 'YTD' or 'Q' for 'QTD' though I suspect these could be improvements for versions 2.0 and 3.0.

Stop Press: Oracle Granted License To Extend February ....oracle

Yes, it's official. Oracle has been granted permission to extend February by 3 days. Shame no-one told the developers responsible for INTERVAL arithmetic.

SQL> SELECT DATE '2006-01-31' + INTERVAL '1' MONTH
2 FROM dual;
SELECT DATE '2006-01-31' + INTERVAL '1' MONTH
*
ERROR at line 1:
ORA-01839: date not valid for month specified
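What the developers presumably wanted is ADD_MONTHS, which clamps to the last day of a shorter month instead of raising:

```sql
SELECT ADD_MONTHS(DATE '2006-01-31', 1) FROM dual;
-- 28-FEB-2006, and no ORA-01839
```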

Hey, we do the WTFs... ....oracle

I found this beauty on Connor McDonald's web site. It's perfect fodder for Oracle WTF and Connor is happy for us to include it here.

Tip: add DISTINCT to every query ....oracle

We all know that SQL can be a harsh mistress, and relational theorists such as Chris Date have long argued that the language is fundamentally flawed and that vendors have been misapplying relational theory from the start.

Now a Perl developer on perlmonks.org has been reading Date's book and finds that it explains all of his frustrations with databases. One tip for addressing their shortcomings is to add the handy DISTINCT keyword to every single query, because stupid old SQL doesn't automatically apply the degree of uniqueness you might have in mind:

In fact, one of the founders of relational theory, C.J. Date, recommends that every SQL SELECT statement be written as SELECT DISTINCT ... Unfortunately, many folks get upset with this for a variety of reasons. First, DBMS do a horrible job of optimizing DISTINCT queries. There are many cases where the DISTINCT can be safely ignored but in practice, using DISTINCT with large sets of data and complicated queries can have an abysmal impact on performance. I think this can be put into the "premature optimization" category, though. Until we know that DISTINCT is causing an impact, isn't it better to improve the chance of getting correct results and worry about performance afterwards?

I had to read that a couple of times to realise that adding a DISTINCT to every single query in the hope that it might mask some unknown deficiency in your model, your query, or the SQL language itself is not the "premature optimization" being referred to here - he means the idea that doing so might affect performance. After all, it might not, right?

Read the rest of the discussion at perlmonks.org: "Why SQL Sucks (with a little Perl to fix it)". It also appears on Database Debunkings, "On the sins of SQL".

Hey, we do the WTFs Part II... ....oracle

At this rate, we might need to make Connor an honorary member. Here's another gem from his web site that he's picked up on his travels. PL/SQL doesn't get much "better" than this...

Umm, I forgot my password ....oracle

Problem solved, with the following convenient password reset procedure found in a large production database, with EXECUTE granted to PUBLIC and a handy public synonym:

CREATE OR REPLACE PROCEDURE reset_user_password (p_username IN VARCHAR2)
AS
BEGIN
   EXECUTE IMMEDIATE 'ALTER USER ' || upper(p_username)
                  || ' IDENTIFIED BY ' || upper(p_username);
END;
/
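To spell out why the PUBLIC grant makes this a gift: any account that can connect can now set any other user's password to that user's own name. A hypothetical session, to illustrate:

```sql
-- Any low-privileged session can run this via the public synonym:
EXEC reset_user_password('SYSTEM')
-- ...after which CONNECT system/system works nicely.
```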
Many thanks to Robert De Laat for this submission.

Error-Prone Error Handling ....oracle

A colleague found this handy utility on the internet. The idea is that whenever you get an Oracle error, the errant SQL statement will be written away to a table along with the message for you to read and enjoy at your leisure. Which would be fine I suppose, if it didn't introduce a whole lot of errors of its own.

create table caught_errors (
   dt       date,
   username varchar2(30),   -- value from ora_login_user
   msg      varchar2(512),
   stmt     varchar2(512)
);

create or replace trigger catch_errors
after servererror on database
declare
   sql_text ora_name_list_t;
   msg_     varchar2(2000) := null;
   stmt_    varchar2(2000) := null;
begin

   for depth in 1 .. ora_server_error_depth loop
      msg_ := msg_ || ora_server_error_msg(depth);
   end loop;

   for i in 1 .. ora_sql_txt(sql_text) loop
      stmt_ := stmt_ || sql_text(i);
   end loop;

   insert into caught_errors (dt, username, msg, stmt)
   values (sysdate, ora_login_user, msg_, stmt_);
end;
/

Note that whenever a SQL error occurs on the database this trigger will fire and:

1) try to stuff the entire SQL statement that failed into a varchar2(2000), regardless of how big it actually is;

2) if that doesn't blow up, try to insert the same value into a varchar2(512) column.

In SQL*Plus, this leads to errors like the following (using an invalid table name in a large SELECT statement):

ERROR at line 30:
ORA-00604: error occurred at recursive SQL level 1
ORA-01401: inserted value too large for column
ORA-06512: at line 12
ORA-00942: table or view does not exist

... which is clearly more informative than:

ERROR at line 30:
ORA-00942: table or view does not exist
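A minimal fix, keeping the author's design and just assuming that truncation is acceptable, is to cut both values down to the declared column sizes before inserting:

```sql
-- Truncate to the declared column lengths so the trigger
-- cannot raise ORA-01401 itself and mask the real error:
insert into caught_errors (dt, username, msg, stmt)
values (sysdate, ora_login_user, substr(msg_, 1, 512), substr(stmt_, 1, 512));
```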

Create Your Own DUAL Table ....oracle

If you want to retrieve a sequence value into a PL/SQL variable, you have to SELECT FROM DUAL. (Or use RETURNING INTO of course, but never mind that now.)

According to some, this is not only an inconvenient restriction, but also prone to failure if SYS.DUAL contains more than one row.

Help is at hand in the form of the utility below, which solves both problems at once by installing a table, a public synonym, a trigger and a function. Now your application will never again be unable to retrieve sequence values directly into PL/SQL variables on days when DUAL contains more than one row. So that's one less thing to worry about.

CREATE OR REPLACE PROCEDURE replace_onerow (
   table_name_in IN VARCHAR2
)
IS
BEGIN
   BEGIN
      EXECUTE IMMEDIATE 'DROP TABLE ' || table_name_in;
   EXCEPTION
      WHEN OTHERS THEN NULL;
   END;

   EXECUTE IMMEDIATE 'CREATE TABLE '
      || table_name_in
      || ' (dummy VARCHAR2(1))';

   EXECUTE IMMEDIATE
      'CREATE OR REPLACE TRIGGER onerow_' || table_name_in ||
      ' BEFORE INSERT
        ON ' || table_name_in || '
        DECLARE
           PRAGMA AUTONOMOUS_TRANSACTION;
           l_count PLS_INTEGER;
        BEGIN
           SELECT COUNT (*)
           INTO   l_count
           FROM   ' || table_name_in || ';

           IF l_count = 1
           THEN
              raise_application_error
              ( -20000
              , ''The ' || table_name_in || ' table can only have one row.'' );
           END IF;
        END;';

   EXECUTE IMMEDIATE 'BEGIN INSERT INTO '
      || table_name_in
      || ' VALUES (''X''); COMMIT; END;';

   EXECUTE IMMEDIATE 'GRANT SELECT ON '
      || table_name_in
      || ' TO PUBLIC';

   EXECUTE IMMEDIATE 'CREATE PUBLIC SYNONYM '
      || table_name_in
      || ' FOR '
      || table_name_in;

   EXECUTE IMMEDIATE
      'CREATE OR REPLACE FUNCTION next_pky (seq_in IN VARCHAR2)
          RETURN PLS_INTEGER AUTHID CURRENT_USER
       IS
          retval PLS_INTEGER;
       BEGIN
          EXECUTE IMMEDIATE ''SELECT '' || seq_in
             || ''.NEXTVAL FROM ' || table_name_in || '''
             INTO retval;
          RETURN retval;
       END next_pky;';

END replace_onerow;
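Worth noting in passing: from 11g onwards a sequence's NEXTVAL can be referenced directly in a PL/SQL expression, which makes the entire apparatus unnecessary (my_seq here is a hypothetical sequence):

```sql
-- 11g and later: no SELECT FROM DUAL required at all.
DECLARE
   v_id NUMBER;
BEGIN
   v_id := my_seq.NEXTVAL;  -- direct sequence reference in PL/SQL
END;
/
```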

What is it with dates? ....oracle

What exactly is it about dates that so many Oracle developers struggle with? Why do they go to such lengths to avoid using the DATE type? Why, if DATE validation or arithmetic is required, would they use CHARs or NUMBERs? Abuses of DATEs seem to be a recurring theme on Oracle WTF. Indeed, here's another good example, kindly provided by Graham Oakes.

Over to Graham...


This is a cracker: the easy way to check that the date supplied in a string (after all, who actually wants to use date types?) is a valid date.

IF to_number(substr(v_valuedate,3,2)) NOT BETWEEN 1 AND 12
THEN
   v_rowstatustype := -190;
ELSE
   -- check 31 day months
   IF substr(v_valuedate,3,2) IN ('01','03','05','07','08','10','12')
   THEN
      IF to_number(substr(v_valuedate,1,2)) > 31
      THEN
         v_rowstatustype := -200;
      END IF;

   -- check 30 day months
   ELSIF substr(v_valuedate,3,2) IN ('04','06','09','11')
   THEN
      IF to_number(substr(v_valuedate,1,2)) > 30
      THEN
         v_rowstatustype := -200;
      END IF;

   -- check leap year feb
   ELSIF substr(v_valuedate,3,2) = '02'
   AND   MOD(to_number(substr(v_valuedate,5,4)),4) = 0
   THEN
      IF to_number(substr(v_valuedate,1,2)) > 29
      THEN
         v_rowstatustype := -200;
      END IF;

   -- check non-leap year feb
   ELSIF substr(v_valuedate,3,2) = '02'
   AND   MOD(to_number(substr(v_valuedate,5,4)),4) != 0
   THEN
      IF to_number(substr(v_valuedate,1,2)) > 28
      THEN
         v_rowstatustype := -200;
      END IF;
   END IF;
END IF;
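For contrast, the usual approach is to let TO_DATE do the validating and trap the exception. A sketch using the variable names from the snippet above, and assuming (from the substr positions) that the string format is DDMMYYYY:

```sql
-- Let the DATE type do the work: TO_DATE raises an exception
-- for any invalid day/month/year combination, centuries included.
BEGIN
   v_date := TO_DATE(v_valuedate, 'DDMMYYYY');
EXCEPTION
   WHEN OTHERS THEN
      v_rowstatustype := -200;
END;
```

This also gets the leap-year rule right for years like 1900 and 2100, which the mod-4 test above does not.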

Umm, I forgot my password, Part 2 ....oracle

In a thread on OTN forums, a poster asked how he could recover a user's password. Naturally he was told that it can't be done because the password itself is not stored, only a hash based on the username and password combination.

After some interesting discussion of password hashing, brute force and rainbow table attacks and the like, a poster makes the following rather novel suggestion:

if you apply the password verify function, yes it is possible to get the password of a user.

Etape 1: edit the utlpwdmg.sql script and add the line which is in bold (insert into...)

-- Check if the password is same as the username

IF NLS_LOWER(password) = NLS_LOWER(username) THEN
raise_application_error(-20001, 'Password same as or similar to user');
END IF;

insert into mytable values ('username','password');

-- Check for the minimum length of the password

Etape 2: run this script as sys

Etape 3: grant the profile to user whom u want to get the password.

u will be able to get the new password by consulting the table mytable (u must create this table)

This had me puzzled at first, and I had to check what $ORACLE_HOME/rdbms/admin/utlpwdmg.sql did. In fact it creates a default password verification function called "verify_function" ("verify_password" might have made a better name, but that's obfuscation for you), and then assigns it to the default profile using ALTER PROFILE DEFAULT ... PASSWORD_VERIFY_FUNCTION verify_function;

This means that any attempt to change the password for a user with the default profile (see ALTER USER examples in the documentation), will automatically execute verify_function(username, password, old_password). The idea is to apply some rules to prevent easily guessed passwords such as your username, but Mouhamadou's ingenious addition is his extra line,

insert into mytable values (username,password);

Now any attempt to change the password for a user with the default profile that successfully passes this extra security step will result in the new password being logged in mytable in clear text.

As we like to say on Oracle WTF, problem solved.

Many thanks to Andrew P. Clarke for submitting this.

Simplest row generator meets maximum inefficiency ....oracle

Generating a set of values that are not stored in a table is a reasonably common problem. I recently came across this solution for a set of irregular interval values.

OPEN cur_rows FOR
   'SELECT 3 FROM dual UNION '||
   'SELECT 6 FROM dual UNION '||
   'SELECT 9 FROM dual UNION '||
   'SELECT 12 FROM dual UNION '||
   'SELECT 24 FROM dual UNION '||
   'SELECT 36 FROM dual';
LOOP
   FETCH cur_rows INTO v_num_value;
   EXIT WHEN cur_rows%NOTFOUND;
I didn't know which to admire most: the clever use of dynamic SQL, or the sort-distinct that UNION performs to remove any duplicate literals that might occur from accidentally typing the same line twice. I think the latter wins, for apparently also being relationally pure.
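For anyone keeping score, a static cursor would have done, and in later versions a collection in a TABLE expression produces the same six rows without the string concatenation or the sort:

```sql
-- Same rows, no dynamic SQL, no UNION sort; uses the built-in
-- SYS.ODCINUMBERLIST collection type (10g and later).
OPEN cur_rows FOR
   SELECT column_value
   FROM   TABLE(sys.odcinumberlist(3, 6, 9, 12, 24, 36));
```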

If only there was a SUM function ....oracle

...then we wouldn't have to write code like this, which, as Graham Oakes can confirm, takes ages:

FOR r IN ( SELECT tid FROM t_brel WHERE bqid = iqid )
LOOP
   SELECT q.lamount, q.famount
   INTO   v_lamount, v_famt
   FROM   t_aq atq
   ,      t_q  q
   WHERE  atq.tid = r.tid
   AND    q.qid   = atq.qid
   AND    qtype   = 10;

   v_ltotal := v_ltotal + v_lamount;
   v_ftotal := v_ftotal + v_famt;
END LOOP;

UPDATE t_q
SET    lamount = v_ltotal
,      famount = v_ftotal
WHERE  qid = iqid;

We can but dream.
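If only. The whole loop collapses into one statement once you let SQL aggregate. A sketch only, using the column names from the snippet above and untested against the real schema:

```sql
-- One UPDATE, one aggregation, no row-by-row fetching:
UPDATE t_q
SET    (lamount, famount) =
       ( SELECT SUM(q.lamount), SUM(q.famount)
         FROM   t_brel atb
         ,      t_aq   atq
         ,      t_q    q
         WHERE  atb.bqid = t_q.qid
         AND    atq.tid  = atb.tid
         AND    q.qid    = atq.qid
         AND    q.qtype  = 10 )
WHERE  qid = iqid;
```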

While we're on the subject, we received this from a correspondent who wishes to remain anonymous:

FOR r IN
(
   SELECT /*+ FIRST_ROWS */
          *
   FROM   pay_details
   WHERE  acct_fk = p_accountpk
)
LOOP
   DELETE pay_details
   WHERE  primarykey = r.primarykey;

   COMMIT;
END LOOP;

I particularly like the FIRST_ROWS hint. "What, it's slow? Better make sure it uses that index..."
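For the record, here is the same job minus the loop, the hint and the per-row commits:

```sql
-- One statement, one commit, no index-forcing hint required:
DELETE FROM pay_details
WHERE  acct_fk = p_accountpk;

COMMIT;
```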

The Decibel Method ....oracle

A poster on the hardcore comp.databases.oracle.server newsgroup had some tables in the production database that he believed were redundant, and asked, not unreasonably I felt, whether there was a way to tell from his Developer 6i application whether they were in fact in use:

hello

we are using oracle 9i production database and d2k 6i applications. is there any way by which i can find the tables/ columns that are not in used by applications. So that i can move them out from our production database.

can anyone throw some light how to do it.

After the obligatory initial responses along the lines of "it can't be done" and "you are an idiot for even thinking about it" (you take your life in your hands when you post on cdos), one respondent begins,

For tables, the solution is called "auditing". You can audit desired objects.

Reasonable enough. Give it a year, then if nobody has used the tables, they are probably not so important. Then he continues:

For columns, you should be using so called decibel method: if you suspect that column C1 in table TAB is not used, you can always execute the following commands:

update TAB set C1=NULL;
commit;

If the reaction to that is a loud scream, accompanied by swearwords and a genuine cornucopia of various expletives, you've made a mistake, the column was used. It's time for the "I'm sorry, I didn't know that this column was still being used" routine. You can rest assured that this swearing sucker is gonna be busy for a while.

If not, you can proceed and drop the column. The previous update has an added benefit of making "drop column" operation faster. It will also expose weak points in all those lousy applications that use "select *" and expect the table to populate all of their variables.

An alternative to the decibel method is fine-grain auditing, described in the books by D. Knox. It's much more tedious and requires much larger knowledge then the decibel method, which is also a lot of fun.

Problem solved I think.

Our thanks to Herod T for the plug.

ORA-06553: PLS-906: Compilation is not possible ....oracle

Or, in this case posted on the OTN Forums, is it even desirable? It's nice to see that the database has the ability simply to say, "I'm sorry, but this code is too stupid to run".

The Clue Was In The Name ....oracle

email from tester: 
This bug is blocking test: bg9876-a, I have elevated it to priority 1.
reply: 
The bug you've raised says "please provide a version of write_transaction() stored procedure, that doesn't write anything to the database", can you explain where you're going with this?
email from tester: 
It is a question of practicality for us, we have used this method in every release, we need to test high throughput rates without a Performance Oracle server.

A Thneed's a Fine-Something-That-All-People-Need! ....oracle

Now maybe I'm misusing this whole forum with this posting, and if so I apologise profusely, but...

www.janus-software.com/fb_fyracle.html

Are you worried about "under-licencing" too...?

Not only does open source Firebird-Fyracle have a catchy name and almost match Oracle on the 6 well known "gold standard key database evaluation mapping criteria", but it is "just as idiosyncratic", AND you can run "Compiere v2.5.3c" ( almost )!

Who could possibly argue with the rigorously documented "proof of the pudding" that this product is "faster than Oracle on the same hardware"?

SP2-0552: Bind variable not declared ....oracle

Please help. Why does SQL*Plus keep giving me the error

SP2-0552: Bind variable "INTARRAY" not declared.

Thanks in advance.

DECLARE
   BatchSize : constant := 50;

   subtype IndexRanges is INTEGER range 1 .. BatchSize;
   type IntArrays is array (IndexRanges) of INTEGER;
   IntArray : IntArrays;

   EXEC SQL DECLARE network_cursor CURSOR FOR
      select pal.provider_id from hold_provider_address_link pal;

   EXEC SQL OPEN network_cursor;

   -- establish a local block, with an exception to
   -- handle the "no data found" condition
begin
   EXEC SQL WHENEVER NOT FOUND raise NO_MORE_DATA;
   FETCH_LOOP:
   loop -- fetch the data, 20 rows at a time
      EXEC SQL FETCH network_cursor
         INTO :IntArray;

      for I in 1..20 loop
         -- process batches of 20 rows
         ...
      end loop;

      commit;
   end loop FETCH_LOOP;

exception
   -- the exception NO_MORE_DATA is raised when there is
   -- no more data to FETCH
   when NO_MORE_DATA =>
      PUT("No more data to fetch. N of rows fetched was ");
      PUT(ORACLE.SQLROWS);
      NEW_LINE;

      -- turn off the error handling
      EXEC SQL WHENEVER NOT FOUND CONTINUE;
end;
/

I mentioned this in a comment the other day so apologies if you've seen it before, but I felt it really deserved its own post.

(Hint: SP errors are from SQL*Plus, and INTARRAY is a bind variable from SQL*Plus's point of view because it begins with ":")

By the way I don't mean to laugh too much at the poor guy who posted this problem on a forum, as from his other posts he seems to have had a crappy application dumped on him without much support from anyone at his company. I did ask him what language it was written in but he hasn't replied. Suggestions, anyone? (My guess is Pro*Ada, which in my opinion we don't see enough of these days.)

Oracle World Goes Sensible shock ....oracle

You may be wondering WTF happened to all the WTFs, and so are we. It seems the Oracle world has been going through a rather depressing sensible season recently, in which nobody posts Pro*Ada code and wonders why it won't run at the SQL*Plus prompt, or suggests adding "AND 1=1" to any query to make it go faster. Even Mr Feuerstein seems to have deserted us.

Perhaps Mike Ault's foray into international economics cheered some of us up. He proposed an ingenious regulatory system requiring (I think) US grain, medicine and other key exporters to adjust their prices in response to international oil market fluctuations, on the grounds that certain oil producers are not taking America seriously enough. Now that'll teach Johnny Foreigner a lesson. "It is time for America to get tough", he adds. Oh dear.

On a more conventional note, a Mr Sahil Malik complained at great length on cdos about how hard it was to install 9i Personal Edition on his PC. At one point he fumed:

Larry Ellison IMHO has only one business idea - "Defeat bill gates and trap every programmer in matrix like pods powering oracle databases". WHAT THE HECK !! Time he matured up a bit.

I agree. Larry Ellison, if you're listening, you need to mature up a bit and forget the whole Matrix pod idea. We all know that ends badly.

Over on the OTN SQL Developer Forum, we were intrigued when one frustrated poster asked:

I cannot find clustered option for the indexes or PKs that I create in SQL Developer.

Where is the CLUSTERED check box in user interface?

Where indeed? Helpfully, Sharon from the SQL Developer team promised to get one added ASAP:

I have added an enhancement request to get this added to the interface in a future release.

We look forward to seeing what that does.

Code generation nonpareil ....oracle

We've all engaged in code generation of one sort or another. For certain problems it's just the right, or perhaps the only tool. But then your content management vendor starts to throw this sort of thing at your database...

OPEN p_cursor FOR
'SELECT content_id, content_status
FROM content
WHERE content_type_id = 319
AND content_id IN (
51055 , 45531 , 42208 , 42911 , 46494 , 52898 , 44262 , 44312 , 47474 , 42792 ,
45956 , 45109 , 53432 , 14936 , 29040 , 28779 , 53015 , 48366 , 53739 , 48565 ,
47188 , 46573 , 43038 , 53534 , 51999 , 49731 , 52847 , 43883 , 41522 , 50804 ,
49975 , 45729 , 53260 , 47658 , 41325 , 49454 , 41374 , 45328 , 51612 , 54347 ,
50092 , 48147 , 42416 , 42570 , 49533 , 41948 , 51740 , 52973 , 42648 , 44867 ,
48289 , 45943 , 49556 , 54550 , 46801 , 43628 , 40569 , 41576 , 46752 , 44982 ,
42309 , 45146 , 47198 , 44993 , 47768 , 47060 , 46889 , 45651 , 47045 , 45830 ,
41248 , 54370 , 43741 , 44183 , 28451 , 45094 , 54332 , 47030 , 42060 , 41293 ,
48287 , 48012 , 47740 , 45688 , 43639 , 48484 , 47583 , 45304 , 51478 , 42633 ,
40558 , 43793 , 41587 , 49407 , 28803 , 43272 , 46464 , 45602 , 43866 , 44521 ,
41200 , 48044 , 46927 , 29186 , 45774 , 43722 , 45128 , 43398 , 47397 , 41670 ,
51888 , 47534 , 29237 , 42486 , 53811 , 44704 , 46618 , 48994 , 44848 , 44573 ,
52956 , 44487 , 42435 , 48164 , 43451 , 52031 , 51300 , 52595 , 53141 , 44032 ,
50904 , 41477 , 42161 , 44622 , 52695 , 43838 , 44562 , 45373 , 44882 , 47247 ,
42367 , 50921 , 46265 , 41933 , 48960 , 43143 , 43345 , 28412 , 48868 , 49005 ,
43533 , 46953 , 52420 , 44017 , 49817 , 41490 , 41395 , 43027 , 47703 , 29064 ,
41689 , 50564 , 44112 , 28957 , 29544 , 45223 , 43252 , 49699 , 47356 , 43565 ,
53604 , 45478 , 29422 , 53041 , 45697 , 44054 , 42469 , 44035 , 48750 , 46667 ,
50060 , 46698 , 48306 , 45849 , 45563 , 42452 , 45143 , 46110 , 47800 , 43076 ,
41280 , 45862 , 41657 , 42114 , 47523 , 51841 , 45988 , 48023 , 47307 , 43203 ,
53464 , 44202 , 47984 , 46218 , 45255 , 51412 , 52409 , 47442 , 45932 , 28269 ,
43823 , 46835 , 47785 , 50968 , 46050 , 41916 , 47219 , 51772 , 41980 , 43377 ,
43681 , 50007 , 43072 , 47150 , 45900 , 53376 , 54338 , 45360 , 49086 , 46520 ,
47626 , 29529 , 50321 , 51982 , 53660 , 51644 , 42028 , 41901 , 53269 , 47995 ,
43754 , 49163 , 46069 , 51213 , 53566 , 52304 , 49588 , 51134 , 45167 , 46447 ,
42108 , 45570 , 29439 , 46091 , 44160 , 52227 , 47324 , 42256 , 51262 , 28854 ,
43220 , 41363 , 45272 , 42097 , 46790 , 51102 , 50338 , 44149 , 49849 , 49375 ,
41814 , 44738 , 54211 , 52712 , 54364 , 50283 , 54442 , 46863 , 48687 , 45178 ,
28464 , 44084 , 47615 , 44504 , 50895 , 46231 , 45723 , 47753 , 47230 , 29600 ,
43430 , 28764 , 41619 , 44217 , 43649 , 47502 , 41509 , 47425 , 42873 , 51814 ,
41438 , 44925 , 15364 , 41539 , 50336 , 42777 , 42127 , 44967 , 46942 , 43578 ,
46644 , 48887 , 52748 , 46904 , 42290 , 44000 , 45446 , 45967 , 46539 , 44361 ,
52990 , 45341 , 42326 , 47952 , 49746 , 29512 , 45806 , 50252 , 48121 , 46981 ,
29162 , 41702 , 48700 , 50692 , 49039 , 44689 , 51181 , 42988 , 41737 , 41782 ,
54283 , 44796 , 42667 , 48793 , 42597 , 49195 , 46018 , 48070 , 42975 , 49264 ,
43675 , 29478 , 51000 , 43287 , 49052 , 51589 , 42506 , 44301 , 44649 , 44908 ,
48072 , 50857 , 46020 , 44067 , 42013 , 42091 , 43464 , 48578 , 54015 , 41412 ,
51361 , 52509 , 47124 , 51316 , 44143 , 29724 , 52714 , 44638 , 46142 , 41459 ,
42586 , 43919 , 48945 , 44346 , 49390 , 44822 , 46631 , 43107 , 43591 , 42144 ,
29755 , 43662 , 47540 , 43808 , 45390 , 42223 , 42371 , 42551 , 47918 , 47337 ,

For brevity's sake I'll skip the next 11,000 elements of the IN list.

If in doubt, test, test and test again ....oracle

Thanks to Rob Baillie for the following example...


I was glancing through some legacy code today, and came across this.

It's funny how barnacles can accumulate when code changes over time.

if r2.status_id = 3 then
   v_gp := r2.rate;
elsif r2.status_id in (11, 12) then
   if r2.type_id = 3 then
      v_gp := r2.rate;
   else
      v_gp := r2.rate;
   end if;
else
   v_gp := r2.rate;
end if;

Skip over the record being called r2 and work out what it actually does...

Dynstatic SQL... ....oracle

Here's Connor again...

Note that the following code has been "anonymised" to protect the guilty.


procedure P is
begin
   ...
   ...
   execute immediate 'drop table T';
   execute immediate 'create table T as select * from ......';
   ...
   ...
   ...
   for i in ( select * from T ) loop
      ...
      ...
   end loop;
end;

Ah, a mix of dynamic and static references... Now how precisely did that compile? Nope, I'm not sure either.

Why Machines Will Never Take Over The World ....oracle

It is a recurring theme in science fiction that as computers increase in power and sophistication, they may one day reach a point where they decide they can do better without us, and condemn us to lives suspended in racked pods with our brains plugged into a vast virtual world, or alternatively send increasingly resourceful and indestructible robots to hunt us down in our bunkers, while above ground survivors fight a desperate war for survival amidst the wreckage of civilisation.

If you find this vision of the future alarming, take heart in this SQL query, which was generated by a machine. Not, perhaps, a Cyberdyne Systems T-800 or the Matrix Mainframe, but something called OLAP API. We don't know what that is either, but we can tell you that if any robot descended from it ever attempts to enslave humanity using a virtual world and an unfeasible pod system, rest assured that you will have time to get out of its way.

That Reminds Me... ....oracle

Sometimes the solutions people come up with to a given problem are breathtaking in their combination of ingenuity and insanity. Take this view, which exists purely to transform a boring "reminder number" like 3, 4, 5, ... into the English text "Third Reminder", "Fourth Reminder", "Fifth Reminder", ...

CREATE OR REPLACE VIEW reminders_view AS
SELECT reminder_id
, reminder_seq
, ( SELECT
DECODE (reminder_seq,
1, 'No reminder',
2, 'Reminded',
3, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
4, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
5, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
6, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
7, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
8, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
9, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
10, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
11, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
12, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
13, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
14, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
15, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
16, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
17, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
18, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
19, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
20, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
21, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
22, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
23, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
24, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
25, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
26, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
27, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
28, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
29, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
30, INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder',
'Reminded') FROM DUAL) reminder_type
FROM reminders;

What a beauty! It works as follows:

1) Use a DECODE to determine what the number is. Note that 2 is treated as a special case, even though it ends up the same as all the "other" numbers above 30.

2) For numbers between 3 and 30:

2.1) Convert to a string

2.2) Concatenate with '-MAY-2004'

2.3) Convert to a date

2.4) Convert back to a string using the 'ddspth' format mask

2.5) Convert to Init Caps

2.6) Append the word ' Reminder'

3) Enclose the whole lot in a redundant "(SELECT ... FROM DUAL)" scalar subquery.

Obviously, this saved the developer from the tiresome task of typing 'Third', 'Fourth', 'Fifth' etc. And presumably he/she isn't aware of the CASE expression that could have reduced it to:

CREATE OR REPLACE VIEW reminders_view AS
SELECT reminder_id
     , reminder_seq
     , CASE WHEN reminder_seq = 1
            THEN 'No reminder'
            WHEN reminder_seq BETWEEN 3 AND 30
            THEN INITCAP(TO_CHAR(TO_DATE(TO_CHAR(reminder_seq)||'-MAY-2004','DD-MON-YYYY'),'ddspth'))||' Reminder'
            ELSE 'Reminded'
       END reminder_type
FROM   reminders;

(And why stop at 30 when May has 31 days?!)
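Incidentally, the date trick generalises past 30 (and past 31) if you spell the number via the Julian-day format, offered here as a curiosity rather than a recommendation:

```sql
-- 'Jspth' spells the Julian day number as an ordinal word,
-- so any positive integer works, not just 1-31:
SELECT INITCAP(TO_CHAR(TO_DATE(42, 'J'), 'Jspth')) || ' Reminder'
FROM   dual;
```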

Well, it gave me a much needed laugh on a Friday afternoon, anyway.

All names have been changed to protect the guilty.

Pause for thought ....oracle

Requirement:

Can I let a procedure wait for a specific time (10 seconds) before inserting values into a table?

One solution (according to a poster on OTN - don't try this at home):

declare
   dStart DATE := SYSDATE;
   nDiff  NUMBER;
begin
   dbms_output.put_line('dStart: '||TO_CHAR(dStart, 'dd.mm.yyyy hh24:mi:ss'));
   LOOP
      nDiff := (SYSDATE - dStart)*86400;
      EXIT WHEN nDiff >= 10;
   END LOOP;
   dbms_output.put_line('nDiff: '||nDiff);
   dbms_output.put_line('END: '||TO_CHAR(SYSDATE, 'dd.mm.yyyy hh24:mi:ss'));
end;
/
/

Or if you prefer,

create or replace function timeout
   return number
is
begin
   return to_number(to_char(sysdate,'SSSSS'));
end timeout;
/

Function created.

create or replace procedure tcal( t in number ) as
   a  number := 1;
   st number := to_number(to_char(sysdate,'SSSSS'));
   x  varchar2(12);
   y  varchar2(12);
begin
   while (a < 99999999) loop
      dbms_output.enable(100000); ---> Just to make loop busy...
      a := a + 1;
      exit when (timeout - st) >= t;
   end loop;

   x := to_char(trunc(sysdate)+st/(24*60*60),'HH:MI:SS AM');

   dbms_output.put_line(' Started: '||x);

   y := to_char(trunc(sysdate)+timeout/(24*60*60),'HH:MI:SS AM');

   dbms_output.put_line(' Ended: '||y);
   dbms_output.put_line(timeout-st||' seconds reached...');
end;
/

Procedure created.

Now that is all well and good, you may say (after all, the CPU isn't doing anything else at the moment, and these days they rarely catch fire), but couldn't we just use the supplied procedure DBMS_LOCK.SLEEP? Well, apparently it is impractical. Those with a quiet afternoon to spare might like to follow the reasons why, at forums.oracle.com/forums/thread.jspa?threadID=402345.
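For completeness, here is the boring answer that doesn't peg a CPU, assuming you can get EXECUTE on DBMS_LOCK (which, reading between the lines, was presumably the poster's real obstacle). The table t is hypothetical:

```sql
-- DBMS_LOCK.SLEEP suspends the session without burning CPU.
BEGIN
   DBMS_LOCK.SLEEP(10);            -- wait ten seconds
   INSERT INTO t (x) VALUES (1);   -- t is a hypothetical table
   COMMIT;
END;
/
```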