Archive for May, 2015
If you are using Crystal Reports to read PeopleSoft data you may be in for a nasty surprise soon. Once you upgrade to PeopleTools 8.55 you will no longer be able to use Crystal Reports.
According to this support thread (available to users with support subscriptions) there will be no way to connect to the PeopleSoft security layer in PT 8.55, and the process scheduler will no longer recognize Crystal Reports. Users are being told that they will have to replace Crystal Reports with Oracle's BI Publisher tool, but I am also being told that BIP can't match the features of Crystal Reports. I am going to ask for some specific examples and will try to incorporate Oracle BI Publisher into my comparison of reporting tools.
If this affects you, it might not hurt to let your contacts at PeopleSoft know.
I have just updated my comparison of server-based scheduling tools for 2015. These tools are similar to the desktop-based scheduling tools I write about every March, but these are designed to run on a server. This allows multiple people to schedule reports for automated delivery by email, FTP or network folder.
There are 9 products on the list this year with one new release and another being discontinued. There are also a few feature updates and price changes since last year. The blog page provides a brief overview of each product. It also has a link to the feature matrix that compares roughly 70 features of these tools. There is even a feature glossary that defines all the terms. So if you need a short course in automating Crystal Reports delivery, this is a pretty good place to start.
Last month I wrote about a report that took 20 minutes to run, and how using the right indexes brought the run time down to under a minute. Yesterday I was able to get another 20-minute report to run in under a minute by fixing a different issue.
At first I wasn’t sure if the run time could be significantly reduced. The report had to pull tables from two different databases, and that is usually a performance killer. So I checked the SQL being generated by the report to see how the two queries were being divided. Instead of two separate queries there were four. One of the two connections was showing up as three separate queries in the SQL – as if it were three different connections. So we went into the menu at “Database >> Set Datasource Location” and found that the report was using three different instances of the same connection. Once we consolidated all three into a single instance, the report ran in under a minute.
So why would tables that all come from one database connection end up under different instances of that connection? Usually I see this happen when the report is designed in stages. A few tables are added, then the user logs out and then more tables are added at a later time. Each new login can be treated as a separate instance of the database. And when that happens Crystal will make a separate query for each instance and combine the data in local memory. This is very inefficient when compared to a single query that is handled entirely by the database.
Having two different instances of the same DB causes the same performance problem as connecting to two different databases. But while it is very difficult to improve performance across two different databases, merging multiple instances of one database is usually pretty simple.
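The cost difference can be sketched with a toy example. This is not Crystal Reports itself, and the table and column names are made up, but it shows the same principle: a join handled entirely by the database in one query, versus pulling each table in a separate query and matching the rows in local memory, which is roughly what happens when the joins can't be pushed to the server.

```python
# Toy illustration of database-side joins vs. joining in local memory.
# (Hypothetical tables; not a Crystal Reports API.)
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE orders (id INTEGER PRIMARY KEY, cust_id INTEGER, total REAL);
    INSERT INTO customers VALUES (1, 'Acme'), (2, 'Globex');
    INSERT INTO orders VALUES (10, 1, 99.0), (11, 2, 45.0), (12, 1, 12.5);
""")

# One connection instance: the database performs the join itself.
joined = con.execute("""
    SELECT c.name, o.total
    FROM orders o JOIN customers c ON o.cust_id = c.id
""").fetchall()

# Multiple "instances": each table is fetched separately and the rows
# are combined in application memory -- far more data moved, and the
# database's indexes and join optimizer are bypassed.
orders = con.execute("SELECT cust_id, total FROM orders").fetchall()
names = dict(con.execute("SELECT id, name FROM customers").fetchall())
local = [(names[cid], total) for cid, total in orders]

assert sorted(joined) == sorted(local)  # same answer, very different cost
```

On a three-row table the difference is invisible, but with real report volumes the second approach ships every row of every table across the wire before any filtering or matching happens, which is why a report built this way can take twenty minutes instead of one.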
There are environments where the only way to test a report is to run it from within an application. The steps to deploy a modified report vary, but they usually involve placing the modified report into a specific folder and/or publishing the report into the application. Sometimes the users aren’t clear on the steps. So when a user reports that a modified report returns the exact same result as the original, I have to wonder if they are actually still running the original. It may be that they missed a step when deploying the new report. Or it may be that the application still has a cached copy of the original report in memory and needs to be restarted to see the modified report.
The most reliable way to confirm that the report being run is the latest version is to mark the report with something obvious. For instance I often take a text object from the page header and underline it. If they run the report from the app and don’t see the underlined object then they know that they are not deploying the updated version correctly. Most people start out thinking this test is a waste of time. But more often than not we find that there is some key step that they forgot. This simple step has saved hours of troubleshooting time.
And if you have to work regularly on reports like this, you should read my previous article on exporting to RPT format. That might allow you to bring data from the application back to the Crystal Reports designer so that you can immediately see the results of your design changes.
Crystal Reports is owned by Business Objects and Business Objects is now part of SAP. So I notice when SAP makes the news.
A study by Onapsis Research Labs, reported by Help Net Security, finds that 95% of SAP systems are exposed to vulnerabilities. It lists the top three attack vectors and discusses the operational problems that contribute to these vulnerabilities. Sounds serious.
On the same day I read an unrelated story about another cyber security company, called Tiversa. According to a former Tiversa employee, the company used extortion, punishing companies that didn’t purchase their services by reporting their vulnerabilities to the FTC. Now there are calls for a federal investigation.
That makes me read reports of security vulnerabilities through an entirely different lens. There has always been a fine line between protection and a protection racket.
In February I started an ambitious project. I expect it to hit critical mass in 6-12 months. I want to compare Crystal Reports to the other leading BI tools. I plan to include SSRS, MS Access, Tableau, QlikView, Indicee, Logi Ad Hoc, List and Label and a few others. The goal will be to help users understand how these tools are different and therefore which tool is best for a specific set of requirements.
One challenge is that these tools are very different in both purpose and approach. So my plan is to create a detailed feature matrix showing what each tool can do and also how it is different. The process and the end result will resemble the comparisons I do for third party products.
Another challenge is that I am not an expert in most of these tools. So, like I do in my other comparisons, I will rely on the people who know the tools best. Ideally the vendors will provide the information directly. One vendor already has. Vendors who want their software represented accurately have some incentive to participate. And when the vendor doesn’t participate I will recruit competent users to review the feature list and mark the features supported by each product.
My job will be to tease out the features that best highlight the differences between products. I will also have to write up the feature definitions so that they are objective and meaningful.
Would you like to help? There are several ways to get involved:
1) Tell me the tools you think I should include, especially ones I didn’t mention. That will help me prioritize products.
2) If you have expertise in any of these tools you can volunteer to review the feature list for that product.
3) Even if you have only limited experience with one of these tools, your impressions would be welcome.