Monthly Archives: May 2010

After-life episode? It’s in your mind

Patients who have had a near-death experience often report walking towards a bright light, or a feeling that they are floating above their body, a sensation that has long been interpreted as a religious vision and confirmation of the afterlife.

Experts now claim it is a surge of electrical activity triggered by the brain in the moments before death, a conclusion drawn from a study of the brainwaves of dying patients. “We think the near-death experiences are caused by a surge of electrical energy released as the brain runs out of oxygen,” said Lakhmir Chawla, an anaesthesiologist at George Washington University medical centre in Washington. “As blood flow slows down and oxygen levels fall, the cells fire one last electrical impulse. It starts in one part of the brain and spreads in a cascade, and this activity gives people vivid mental sensations.”

Many revived patients have reported being bathed in bright light or suffused with a sense of peace as they walk into a light-filled tunnel.

A few even say they experienced visions of religious figures such as Jesus, the Prophet Muhammad or Krishna, while others describe floating above their own deathbed, observing the scene. In one of the most famous cases, in 1991, the American singer Pam Reynolds reported watching the top of her own skull being removed by surgeons before she moved into a bright, glowing realm, and later gave detailed accounts of the surgery and of her surgeons’ conversations.

Chawla’s team used an electroencephalograph (EEG), a device that measures brain activity, to monitor seven terminally ill people. He noticed that moments before death, the patients experienced a burst of brainwave activity lasting from 30 seconds to three minutes. The activity was similar to that seen in people who are fully conscious, even though the patients appeared asleep. Soon after the surge abated, the patients were pronounced dead. Chawla’s research, published in the Journal of Palliative Medicine, is thought to be the first to suggest a specific physiological cause for near-death experiences.

Although it describes only seven patients, he says he has seen the same thing happen “at least 50 times” as people die. Other scientific studies suggest that 15-20% of people who go through cardiac arrest and clinical death report lucid, well-structured thought processes, reasoning, memories and sometimes detailed recall of events during their encounter with death. Chawla is now planning a further study, using much more advanced EEG machines, to follow exactly what happens to the brain during death. “Our findings do not really tell us anything about whether there is an afterlife or not. Even if these near-death experiences turn out to be a purely biochemical event, there could still be a God,” he said.


A carbon ‘burp’ ended ice age

An international team led by a scientist from the University of Cambridge has found the possible source of a huge carbon dioxide “burp” that happened some 18,000 years ago and which helped to end the last ice age.

The results, published in the journal Science, provide the first concrete evidence that carbon dioxide was more efficiently locked away in the deep ocean during the last ice age, turning the deep sea into a more “stagnant” carbon repository, something scientists had long suspected but lacked the data to support.

The team, led by Dr Luke Skinner of the University of Cambridge, radiocarbon-dated shells left behind by tiny marine creatures called foraminifera (forams for short). By measuring how much carbon-14 (14C) was in the bottom-dwelling forams’ shells, and comparing this with the amount of 14C in the atmosphere at the time, they were able to work out how long the carbon dioxide had been locked in the ocean, a university release said. By linking their marine core to the Antarctic ice cores using the temperature signal recorded in both archives, the team was also able to compare the results directly with the ice-core record of past atmospheric carbon dioxide variability.

Throughout the past two million years, the earth has alternated between ice ages and warmer interglacials.


When is a large document too large? Some rules of thumb for sizing Web Intelligence.

In a typical scenario, Web Intelligence is used by report authors who want to deliver interactive content to consumers. Authors are sometimes pressured to create extremely large WebI documents that will ideally answer any possible question a consumer might have. One WebI customer is delivering documents spanning nearly 200,000 pages. It works, but the delivery is very delicately controlled. In general, however, this tendency to build documents with huge data volumes is the same affliction that causes someone to buy a house with two extra bedrooms on the chance that someday, somehow, both sets of grandparents will come calling the same weekend. It might just happen. Whew! What peace of mind to have shelled out an extra $150k for two rooms you might simultaneously fill once a decade!

In the WebI world, the desire to build reports with large data volumes inevitably pushes the limits of what the WebI server (for dHTML or Java clients) or WebI Rich Client can handle. WebI memory use, which naturally increases with document size, is limited by the process address space. As of XI 3.1, on a 32-bit process, the limit for WebI document size is around 2 gigabytes. This memory is consumed by the storage of the data provider results, calculations and the report engine.

In practice, from the perspective of data in the data provider results, WebI can support up to around 500 megabytes, roughly 25 million values (e.g. 5 columns × 5 million rows, 25 columns × 1 million rows, etc.). At that point, the document consumes all memory available to the process. For the Rich Client, content can be consumed offline on each user’s machine, so this memory is not shared. For online clients, the process must be shared by all concurrent clients on a given server, so divide the document size limit by the number of concurrent users (e.g. 10 concurrent users on a document of 2.5 million values could max out the server).
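As a back-of-the-envelope illustration of the arithmetic above, here is a small Python sketch that applies the 25-million-value rule of thumb and the divide-by-concurrent-users rule for online clients. The constants and function names are illustrative only, not part of any SAP API:

```python
# Rule of thumb from the text: a single WebI process tops out at roughly
# 500 MB of data-provider results, i.e. about 25 million cell values.
MAX_VALUES_PER_PROCESS = 25_000_000

def doc_values(rows, cols):
    """Total cell values held by a document's data provider(s)."""
    return rows * cols

def max_concurrent_users(values_per_doc, budget=MAX_VALUES_PER_PROCESS):
    """For online (dHTML/Java) clients sharing one server process,
    divide the per-process budget by the size of each open document."""
    return budget // values_per_doc

# 500,000 rows x 50 columns = 25 million values: one user fills the process.
big_doc = doc_values(500_000, 50)
print(max_concurrent_users(big_doc))      # -> 1

# A 2.5-million-value document leaves room for roughly 10 concurrent users.
smaller_doc = doc_values(500_000, 5)
print(max_concurrent_users(smaller_doc))  # -> 10
```

Remember that this is a rough planning estimate only; as the next paragraph notes, layout complexity, variables and multi-source synchronization also consume memory.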

Again, it’s important to note that these are rules of thumb, not absolutes. You might find the server performing adequately even with such gigantic documents. However, the size of a WebI document in terms of rows and cells is not the only variable in play. Synchronization of dimensions between multiple data sources and the number of variables also have an impact, as does the complexity of the layout. So a 10-million-value document with multiple sources, lots of complex variables and calculations, and many charts, tables and conditional formats might put as much pressure on the server as a table dump with 25 million values.

But before you start categorizing WebI as a client best suited for small documents, let’s step back and think about what a document with 25 million values means for report authoring and interactive consumption. First, it’s absurdly large. Just for reference, a 5-million-word MS Word document could easily run to more than 10,000 pages. Second, it’s absurdly large. Take the example of a query that retrieves 500,000 rows and 50 columns: 25 million cell values. Among those columns, you might have Region, Country, City, Year, Quarter, Month, Product Level 1… Product Level n, Industry Code, Customer Name, Address…. And then of course there are measures like Sales, Units Sold, various Counts, etc. Maybe these rows and columns are fed by another source or two: a customer support site, plus perhaps an Excel file with forecast data. This report is great! It is a sandbox for answering any question at any level. Just which of the dozen tabs should I click on, and how long should I scroll, to get my answer? Third, it’s absurdly large. Maintaining a document this large, with all of its different analytical views and its web of interdependent variables and calculations, is always going to be painful. You had better hope the author never leaves the organization, because untangling the intentions and interdependencies within such a large document will be next to impossible.

How will users, concretely, consume this volume of data? A handful of aggregates in a table, with options to drill to four or five levels of detail across any class of dimension (Geography, Time, Product Levels, Customers, etc.)? In that case, since the author is not adding value to the content through formatting, calculations and variables, juxtaposition of content to tell a story, or synchronization of data between different sources, this is a clear case where Explorer would bring value. If the expectation is for consumers to explore the data “as is”, then give them Explorer and let them find their factoid answers at any level of detail.

Often in these scenarios, the author does try to add value by defining a multi-tab WebI document with dozens of different detailed views of the data organized into different themes. These documents take weeks to create and validate and can take days to read through, but in theory they enable users to have any possible question answered, no matter how detailed and unlikely. For the vast majority of users, putting content as detailed as this into one document is like buying flood insurance when you live in the Sahara. Yes, it is possible that someone, one day, might want access to those details in one session. And over the years, a user might access quite a few details related to numerous dimensions. But is it worth the price in performance to buy such insurance with a document that size? Instead of building such a monstrosity as one document, consider alternatives for delivering the same content: a couple of linked documents, features such as query drill or prompts (optional prompts might help), queries that pull specific levels of aggregation instead of one giant chunk, or more ad-hoc query and document creation with compulsory prompts turned on.

Rest assured, however, that help is on the way for customers who insist on such humongous documents. In an upcoming release, we plan to make WebI 64-bit, effectively removing the process limit and enabling more physical memory to be added to the server to improve the handling of larger reports and throughput. The Rich Client on an end user’s machine, which uses a “local server” implementation, will also become 64-bit in a future release. (Note that the Rich Client has essentially repatriated some WebI server components into the Rich Client installation. This is what enables WebI content to be taken offline from BOE.)

But in the meantime, authors should continually weigh the trade-offs involved in building extremely large documents. Ultimately, it comes down to understanding what end users (consumers) really want and need to do with the content. Next time you get pressure to create a 12-tab, 50-dimension, one-size-fits-all report, push back a little. The consumers are not always right. Ask them how many 10,000-page Word documents they use.

Michael Thompson is Director of Product Management at SAP Business Objects.


Office 2010 – SAP Gives the Go-Ahead

Microsoft Office 2010 is already available for enterprise customers, while users of the home edition will have to hang on until June 15. Microsoft promises to make Office our constant companion – but just how compatible is the software with SAP systems?

SAP GUI says yes to 32-bit but no to 64-bit (graphic: Jan Meyer)

The current SAP GUI 7.20, which SAP launched in April 2010, supports both the latest Microsoft operating system Windows 7 and the new Office package. But while the SAP GUI runs easily on the 32-bit and the 64-bit versions of Windows 7, SAP is only giving the go-ahead for the 32-bit version of Office. Why is that?

Microsoft now offers a 64-bit version of its Office suite. However, SAP GUI 7.20 is not compatible with the 64-bit Office version, because the SAP GUI is a 32-bit application. When it runs on Windows 7 (64-bit), the operating system’s integrated emulation mode kicks in. But the 32-bit SAP GUI and plug-ins such as the Outlook integration cannot communicate with the 64-bit Office version. SAP is aware of the incompatibility and is currently examining two different approaches to address the issue. The result will probably be a 64-bit SAP GUI, but there is currently no indication of when this might become available.

Collaboration, smartphones, and the Web office

Microsoft promises those who buy the new Office suite that it will be their constant companion, wherever they go. According to Ralph Haupter, the new head of Microsoft Germany, a tailored version of Office 2010 will be available for all the major platforms (PC, smartphone, and Web). Thanks to the new Office Web Apps – online versions of Word, Excel, PowerPoint, and OneNote – files can be processed, downloaded, and published using the Web browser.

Ralph Haupter, the new CEO of Microsoft Germany, oozes confidence at the launch in Germany (photo: Benjamin Blaume)

In combination with Microsoft SharePoint 2010, which was launched at the same time, Office 2010 is slated to strengthen its position as a collaboration platform. Several people can work on the same open files at the same time, while social networks such as Facebook and Microsoft’s Windows Live service can be integrated using the Outlook Social Connector.

Haupter predicts that most companies will upgrade to the new Office version in the medium term. His confidence stems from the fact that this Office version was already very stable when it hit the market, thanks to the biggest beta test of all time (8.6 million users worldwide, 520,000 in Germany), and from the license model adopted. Haupter says that 50% of large companies in Germany can upgrade to Office 2010 at no additional cost, because their enterprise license agreements already include the required licenses.

Anyone who would like to try out Office 2010 can download a 60-day trial version of Office Professional Plus from the Microsoft Web site. The beta version for private customers will expire in October 2010.


Microsoft chief shrugs off Apple rise

Microsoft chief executive Steve Ballmer at a press conference in New Delhi on May 27 (AFP photo).

NEW DELHI (AFP) – Microsoft chief executive Steve Ballmer on Thursday said he was unconcerned that his company had been overtaken by rival Apple as the world’s biggest technology firm in terms of market value.

Ballmer spoke to reporters in New Delhi a day after Apple, maker of the iPod, iPhone and iPad, passed the software giant’s market value for the first time.

“It is a long game. We have good competitors but we too are very good competitors,” he said. “I will make more profit and certainly there is no technology company on the planet that is as profitable as we are.”

“Let’s see what happens. I am still pleased that 94 times out of 100 somebody picks a Windows PC,” he said.

Ballmer was in India to underline the future importance of Microsoft’s cloud services platform, in which people use applications hosted online instead of buying, installing and maintaining software on their own computers.

“India will not only see a surge in consumption of cloud services, driving growth in domestic IT usage, but companies all over the world will look to India for their transition to cloud computing,” he said.

The technology is expected to create more than 300,000 jobs by 2015 in India, he said.

Microsoft shares shed 4.07 percent on Wednesday to close at 25.01 dollars, dropping its market capitalisation — the number of shares outstanding multiplied by the stock price — to 219.18 billion dollars.

Apple shares lost 0.45 percent meanwhile to close at 244.05 dollars, giving the company a market value of 222.07 billion dollars.

Apple stock has risen steadily as chief executive Steve Jobs, who returned to Apple in 1997 after a stint away, piloted the release of hit products starting with the iPod in 2001.

VIA YahooNews