Deski is back! Using the BI4.1 Deski Compatibility Pack

OK, Deski isn’t exactly “back,” but you no longer have to wait for your Deski reports to be converted to Web Intelligence before moving to the latest SAP BusinessObjects platform.  In the past couple of years, I’ve chatted with several companies that are longtime Deski shops.  When they hear that Deski has been deprecated in 4.0 and that they need to convert their Deski reports to Webi, they cringe.  And I’ve had clients with 6,000+ Deski reports!

The release of SAP BusinessObjects BI 4.1 SP3 makes the new Desktop Intelligence Compatibility Pack available.  What this means for Deski users is that you can now host Deski reports on the BI 4.1 SP3 platform.  You can apply security to these reports, view the history of scheduled instances, and cut, copy and paste them.  What you cannot do in the platform is open, refresh, create or schedule Deski reports.  All of that happens within the Desktop Intelligence client application.

How to Connect Deski to BI 4.1

To get the Desktop Intelligence client application, go to the SAP Service Marketplace (service.sap.com) and download the SAP BusinessObjects XI 3.1 Client Tools at SP6 Patch 1 or higher.  This patch allows you to connect your Desktop Intelligence application to the 4.1 SP3 platform.  From there, you use the standard method of opening reports to retrieve a report from the platform.  You can then modify and refresh the report to meet your requirements and see the latest data.

Using Desktop Intelligence with the BI 4.1 Platform

Once the Desktop Intelligence SP6.1+ client is installed, you can connect to the platform as you normally would.  When you fire up Deski for the first time, it asks you to log into the platform.  Specify the name of your BI 4.1 CMS node or cluster in the System field along with a valid user name, password and authentication type.  Clicking OK attempts to connect to the repository and, if the connection succeeds, presents the standard report creation template.  Once connected, you will see the repository name and port in the top left corner of Deski.
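For reference, a first-time login on a default installation looks something like the following (the server name is hypothetical; 6400 is the default CMS port):

      System:          bi41server:6400        (CMS host or cluster, optionally with the CMS port)
      User Name:       jdoe
      Password:        ********
      Authentication:  Enterprise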

To open an existing Deski report that has been migrated to the BI 4.1 platform, simply click File, then Import from Repository.  You have the option of viewing Folders or Categories.  Select the location where your Deski reports were previously saved and click Retrieve.  Once the report has been opened, it’s business as usual.  You can refresh the data, modify the report, perform analysis and even export the report back to the repository.

To save a Deski report back to the repository, simply click File, then Export to Repository.  This gives you the ability to maintain Deski reports according to reporting requirements so that your report development doesn’t come to a complete standstill when the platform is upgraded.  Keep in mind, however, that Desktop Intelligence isn’t coming back, and major maintenance and creation of new reports are discouraged.  These reports should be converted to Webi as soon as possible.

Scheduling a Deski Report

When the SAP BusinessObjects 4.1 SP3 platform is installed, the Desktop Intelligence Compatibility Pack is automatically included.  What is not included are any Deski servers to handle the execution or scheduling of reports.  These processes all happen on the desktop client.  To schedule a Deski report, open the Desktop Intelligence client, select the report to be scheduled from the Repository listing, then click the Schedule button.  A window appears that contains three tabs: General, Change Schedule and Distribution.  The contents of these tabs will look familiar if you have used standard scheduling in SAP BusinessObjects.  Set your formats, printing options, run times and distribution options, then click OK.

Since there are no job servers on the BI 4.1 platform to handle the scheduling of Deski reports, the report instances are executed as standard Windows tasks.  In Windows, click the Start button, type “Task” and open Task Scheduler.  Expand Task Scheduler Library -> SAP BusinessObjects and look for the task that was just created.  You can also look at the History in the BI 4.1 platform to see the status of the instance that you just created.
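If you prefer to verify the schedule from a command prompt, schtasks can list the tasks that the Compatibility Pack creates.  This is a quick sketch; the task folder matches the Task Scheduler path above, but the individual task name shown here is hypothetical and will vary by report and instance:

      REM Show the task entries that live in the SAP BusinessObjects task folder
      schtasks /query /fo LIST /v | findstr /i /c:"SAP BusinessObjects"

      REM Query a single task by name (hypothetical task name)
      schtasks /query /tn "\SAP BusinessObjects\Sales Report Instance" /fo LIST /v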

Wrapping It Up

It’s useful that users can now move to the latest BI platform to take advantage of all the great new features (Lumira, Mobile, etc.) without succumbing to the stress of a huge conversion effort.  The Desktop Intelligence Compatibility Pack provides these customers the bridge they need to continue using their reports while migrating them to Web Intelligence.  It is important to understand that you can refresh and schedule reports, but only using the Deski desktop client.  If you migrate Deski reports from XI R2 or XI 3.1 to BI 4.1, those existing schedules will fail because there are no Deski job servers in 4.1 to handle the jobs.  So if you rely heavily on Deski and want to upgrade to the latest 4.1 version of the SAP BusinessObjects platform, consider starting that project!  Just keep in mind the limitations.

Summary:

Pros:

Host Deski reports in the latest platform

Create and maintain Deski reports based on requirements

View historical instances in the report’s History in the BI Launch Pad (just not for .rep destination types)

Cons:

No refreshing or scheduling of Deski reports in the Platform

Deski reports cannot be viewed in the Platform, only in the desktop client

XI R2 users must install the XI 3.1 SP6.1+ client to connect to BI 4.1

Reference:

http://scn.sap.com/docs/DOC-43592

Two Days in the Life: My Experience at BI2015 (Part 2)

Tuesday March 10 – Day 2

So my second day didn’t look like it was going to be hectic, but things aren’t always how they appear.  On Tuesday, I was scheduled for booth time and two sessions.  It turned into another intense day!  And I actually enjoyed it all!

I started the morning on a call with SAP but learned pretty quickly that it had been rescheduled.  Score!  I now had time to work on LinkedIn, do some research, and send and respond to emails.  I enjoy using LinkedIn to keep in touch with everyone that I meet!

On my way down to the booth, I crossed paths with the legendary Mico Yuk.  Full of energy and wearing great shoes, she quickly caught up with me before we continued on our way.  Once I made it to the booth, Ryan Goodman stopped by to chat and show me the latest developments with Centigon’s CMaps Analytics.  There’s some truly cool stuff that’s very easy to use here.  We spent some time in the Analytics pod looking at their work and discussing the details of embedding mapping technologies into Web Intelligence.

I was then asked to join a discussion on visualizations and HANA with a group from Decision First and a guest in the booth.  After discussing visualizations and HANA in the guest’s environment, I returned to the Analytics pod to help demo Lumira Desktop, Edge and Server and discuss the differences among the offerings.  Once the demos concluded, a quick check of my watch told me there was no time for lunch before I had to go test the lab for my 1 pm Web Intelligence session.

Similar to the previous day, the Web Intelligence session was packed.  The guys even added more laptops to accommodate the interest!  We covered basic to advanced topics and took even more great questions.

After the session, I went back to the booth and walked down to Pitney Bowes to learn more about their MapInfo integration solution for Web Intelligence.  I really enjoyed the demo and am appreciative that they answered all of my questions.  There’s some pretty impressive tech going on there!

I got into a discussion with a Decision First sales rep about a blog that I wrote a couple of years ago comparing BPC on Microsoft SSAS with BPC on SAP NetWeaver.  Along the way, I was happy that I could be at the Welcome pod to give away a couple of our books, “Implementing SAP HANA”.

It seems that the time flew by once again, and soon it was time to get back to my hands-on lab to ensure that the servers were prepared for the second Dashboards session.  For the fourth time, everything was flawless and we had another great session on Dashboards.  I compiled some of my favorite questions from that session in this blog.

I then attended a client party at Emeril’s that was co-hosted by Decision First and NIMBL, our new SAP partner.  Sampling a new merlot, I met quite a few new people and was able to catch up with some folks that I hadn’t seen in a while.  I then joined a few of the Decision First folks and a client at Fiamma for a late dinner.  We had conversation around HANA, analytics, architectures, grappa and limoncello!

Because I had a 6 am flight on Wednesday morning back to Atlanta, I was in bed by 11:30 to be up by 3:30 am.

In a nutshell, I would describe the conference as busy, intense, successful and engaging.  If you were able to attend, I hope that you enjoyed the conference and got as much out of it as I did as an attendee and presenter.  If you were not able to attend, I hope that you check out the Reporting and Analytics conference this fall (R&A 2015) or the BI 2016 conference next year.  Decision First has discounted admission rates that can save you quite a bit on conference attendance, and you’ll get invitations to the best parties at the conference!

Two Days in the Life: My Experience at BI2015 (Part 1)

Monday March 9 – Day 1

If you’ve never attended one of the SAPInsider conferences, I have one word to describe them: intense.  I’ve been on the SAPInsider speaking circuit since 2011, and each show gets better and better.  I just returned to Atlanta from the BI2015 conference in Las Vegas and thought that I would share my personal experience of what attending one of these conferences can be like.  As you will see, a LOT happens during the short two days that I was in Vegas, but I’ll try to be concise.

Aside from booth duty with my company, Decision First Technologies, I delivered two sessions: Web Intelligence and Dashboards.  These two sessions were also repeated due to high demand, so I effectively had four sessions.  In addition, I was selected for the SAP Experts 1:1 session on Monday.  I decided to fly in on Sunday evening since I was scheduled to test the Webi and Dashboards lab on Monday at 7 am.  On the way to the lab, I saw Bridget from SAPInsider and she personally checked me into the conference!  I got in there at 7 am and tested the servers.  They were all perfectly configured and ready to go.  Thanks Mike, Rey & team for the great work!

I then met my colleagues at a table near the registration booth and jumped on a client call to discuss the different visualization options and ways to streamline a demo for C-suite executives.  I walked down to the keynote hall and rejoined my colleagues.  SAP’s new CTO, Quentin Clark, discussed everything that’s going on at SAP, especially on the SAP HANA side of things.  Jayne Landry then came on stage to announce SAP Lumira Edge edition!  Adrian Westmoreland joined her on stage and gave a live demo!  Those are always tricky on stage, but he pulled it off flawlessly.  I blogged about SAP Lumira Edge before the conference.

After the keynote, I went over to the Decision First booth.  Another huge presence at the conference!  If you’ve not visited, we had four pods: Welcome, SAP HANA, EIM and Analytics.  You can move from pod to pod to learn more about the different technologies and get your hands on them.  There’s a hangout space in the middle with a fantastic padded floor that’s a lifesaver after being on your feet all day.

I joined in on conversations around visualizations and the differences between the versions of Lumira Server.  I would randomly hear “We have one of the authors here” when the Welcome pod was presenting our book “Implementing SAP HANA”.  I enjoy talking about that creative process.  Eric Vallo stopped over and chatted for a while.  Afterwards, I walked around a bit (in search of a Diet Coke) and had a chat with APOS to discuss their latest offerings.  Some great tools and functionality there!

Once I made it back to the booth, I had the great honor of meeting Jayne Landry from the morning’s keynote!  We discussed my enthusiasm for Lumira Edge and ways that we are getting the word out to our clients.  With stars in my eyes, I knew I had to eat before my 1:30 session started so I joined my colleagues for lunch where we discussed the new Predictive Analytics 2.0.

I made it to the hands-on lab room around 1 pm to test the servers (there were 40!) to ensure a smooth class.  Everything was humming along and people started coming in.  We filled up very quickly and even had a line down the hallway.  I asked if anyone in the room would be willing to share a machine, and a lot of hands went up.  People were there to help others, so we fit everyone into the room.  We had a great session on Advanced Topics in Web Intelligence, and I imparted my usual bad jokes and my advocacy of sap.com/learnbi.

Once the session was done, I went back to the booth for more conversation around Lumira, Lumira Edge and general reporting and dashboarding with a few demos in the Analytics pod.

That time went by very quickly, and it was time to go back to the hands-on lab to prepare for the Dashboarding session.  Once again, flawless servers and a packed room.  We discussed, among other things, the roadmap for Dashboards (Xcelsius) and whether what we were about to cover was unnecessary (it isn’t!).  Typically we have folks in the room who have never used dashboards and some who have, so I geared my presentation towards both.  We learned how to build dashboards as well as how to implement advanced functionality in areas that include interactivity and advanced formatting.

Once the Dashboards session was complete, I had to RUN to the Exhibition Hall for the 6:30 pm SAP Experts 1:1 sessions.  There were folks already at the table, including Mr. Don Loden, who was there to discuss all things Data Services.  My topics included all of the front-end tools, architectures and configurations.  I talked with various folks at the table about deciding between SAP BI technologies and Microsoft technologies, advanced architecture planning and pitfalls, and migrations from older versions of SAP BusinessObjects to the newest version.  There are never any easy answers to questions like these, but we find ways to discover options and determine courses of action.

Once the Experts session was over, I was really looking forward to a 7:30 client dinner at Tom Colicchio’s CRAFTSTEAK at MGM Grand.  When I work with my clients, I try to understand their passion for what they do.  When I provide my deliverables, I look forward to their responses on how it will help them in what they do.  That is my payback.  Because we get into discussions around such things, I tend to form personal interests in what they are doing and therefore love keeping in contact with the client.  This was the case with my dinner companions.  We discussed their plans for the future and ways that we can get questions answered.  We discussed their past engagements and our families.   We had a great bottle of wine and the 3 hours seemed to fly by.

I’m not much of a night owl nor am I a gambler, so I decided to go back to my room for the night.  With my stomach full and my feet hurting, I recounted to myself what happened during the day.  As you can see above, it was quite a bit.  I went to bed.

Continue to Part 2.

Reporting On SAP BusinessObjects BPC in SAP BusinessObjects 4

The Assignment

On a recent client engagement, our statement of work dictated that we build reports and dashboards from an SAP BusinessObjects Planning and Consolidation (BPC) environment.  In SAP BusinessObjects 4, we have two methods of hosting a BPC environment: SAP NetWeaver and the Microsoft platform (Microsoft SQL Server Analysis Services).

In this engagement, we are building reports against SAP BusinessObjects BPC for the Microsoft Platform 10.  The standard workflow that we follow is to build an OLAP connection to the BPC system.  Ideally, we would connect to the BPC application via the BPC connector.  Next, we would build and publish a semantic layer on the newly created OLAP connection.  Finally, we would build our reports and visualizations on that semantic layer.

The Problem

After working with the standard BPC connector for our BPC installation (SAP BusinessObjects Planning and Consolidation for the Microsoft Platform 10), I started encountering very strange issues.  I was able to create the OLAP connection and build the semantic layer on the connection.  But when I started using the semantic layer in Web Intelligence, I noticed that the performance of the system was abysmal and the data coming from BPC was not correct.  Every time I opened the Member Selector for our various objects, it would take approximately 3-5 minutes to populate the members.

It also seemed that the Internet Information Services (IIS) application that hosts the BPC web service was becoming bogged down.  Restarting IIS seemed to fix the issue temporarily, but it kept coming back.  I also encountered an issue where I could build a report on the BPC system and the data would refresh the first time.  When I tried to change the query or simply refresh it, I got a message that GroupSets (SmartMeasures) are not supported.  Keep in mind that I did not change the query at all, but simply refreshed the report.  If I drop the measure from the query, the report refreshes.  If I add the measure back in, the report refreshes the first time but not again; I then get the same GroupSets (SmartMeasures) error.

Finally, I started getting an error related to a time hierarchy.  Running a Webi report with this hierarchy started throwing the generic “WIS_30270” error.  When I looked at the logging files available on the BI4 server, the errors related to a date conversion that Webi was trying to do.  I changed the qualification at the universe level to Standard (from Time), but the issue kept happening.  The field shows as a Time object in the Operations cube and the IDT recognizes it as such, but I think the values of the field (e.g., 2012.Total) are throwing off the date parsing, since 2012.Total isn’t a valid date.

The Workaround

Some of our consultants have long-standing experience connecting SAP BusinessObjects reporting and analytics to BPC systems.  I decided to try connecting directly to Microsoft SQL Server Analysis Services, which is where the BPC cubes are stored.  To do this, I followed the steps in the following Microsoft article:

Configure HTTP Access to Analysis Services on Internet Information Services (IIS) 7.0: http://msdn.microsoft.com/en-us/library/gg492140.aspx

Once I set up msmdpump.dll as a web service hosted on IIS within its own application pool, I was able to successfully test the Analysis Services web service using Excel and Management Studio, as recommended in the Microsoft article.  Finally, I created a new OLAP connection in the CMC.  This time I selected Microsoft Analysis Services 2008.

 

Figure 1: OLAP Connection Options in the CMC

 

Figure 2: A Sample OLAP Connection to MSAS
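For reference, the new connection points at the msmdpump.dll HTTP endpoint rather than at BPC.  A minimal sketch of the settings, assuming the IIS application was named OLAP as in the Microsoft walkthrough (the server name is hypothetical):

      Provider:  Microsoft Analysis Services 2008
      Server:    http://bpcmssql01/OLAP/msmdpump.dll
      Cube:      choose the BPC application cube once the connection authenticates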

Once the Analysis Services connection was created and the semantic layer was updated to use the new connection, the reporting and analytics processes worked as expected.  The biggest disadvantage of this approach is that the security established within the BPC application is bypassed.  The Analysis Services approach connects directly to the cubes running on the Microsoft SQL Server instance and does not take into account any security established in BPC.

The Final Product

Once I established a connection directly to Analysis Services, the semantic layer, Web Intelligence reports and dashboards worked very well.  The response time was fast and the data was accurate.  It took very little time to build useful reports and dashboards that could be published, scheduled and executed, providing a high level of detail within SAP BusinessObjects 4 while connected directly to the BPC data.  SAP is aware of the issues related to the native BPC OLAP connector and is currently researching them.  If you encounter errors with the BPC OLAP connector in SAP BusinessObjects 4, consider switching to the MSAS connector until the issue is resolved.

Repairing a Corrupted BI Services Installation

After working with a client on a dashboarding project in SAP BusinessObjects 4, we encountered an error with SAP BusinessObjects Web Intelligence when trying to connect to BI Services.  We could neither connect nor publish.  After some investigation, it turned out that we needed to redeploy dswsbobje.  Here’s a bit of the information that we gathered and the resolution that we discovered.

Symptoms:

Connecting to BI Services in Webi (BI launch pad or Rich Client) throws random XML parser errors such as: XML reader error: javax.xml.stream.XMLStreamException: ParseError at [row,col]:[1089,5] Message: XML document structures must start and end within the same entity.

You can neither see existing web services nor publish new ones.

Opening the URL http://<servername>:<port>/dswsbobje/services/Session?wsdl goes into an infinite loop and will crash your browser.  Ensure that wsdl is lowercase.

Cause:

From time to time, the BI Services web application (dswsbobje) can become corrupt and cause the symptoms shown above.

Resolution:

Redeploy the dswsbobje application.  To redeploy, follow these steps on the BI4 server (the same commands are collected in a single block after the list):

  1. Stop Tomcat and the SIA using the CCM
  2. Open a command prompt
  3. Go to the directory: C:\Program Files (x86)\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\wdeploy
  4. Run the command: wdeploy tomcat6 -DAPP=dswsbobje deploy
  5. Look for BUILD SUCCESSFUL at the bottom of the resulting output.  If you do not see BUILD SUCCESSFUL, rerun the wdeploy command
  6. Start Tomcat and the SIA using the CCM
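For convenience, here are the same commands as a single block (a sketch that assumes a default installation path and the bundled Tomcat 6; if your deployment runs Tomcat 7, substitute tomcat7):

      REM Run with Tomcat and the SIA stopped (steps 1 and 2 above)
      cd "C:\Program Files (x86)\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\wdeploy"
      wdeploy tomcat6 -DAPP=dswsbobje deploy
      REM Confirm the output ends with BUILD SUCCESSFUL, then restart Tomcat and the SIA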

Go to the URL http://<servername>:<port>/dswsbobje/services/Session?wsdl.  The browser should not crash.

Try to connect to BI Services in Webi (launch pad or rich client).  The issue should be resolved.

Personalizing an Exploration View

SAP BusinessObjects Explorer provides the ability to interact with data, delivering insight that goes beyond standard reports.  However, the question of data security must be addressed when we interact with this data.  One of our recent clients had a requirement to give users the SAP BusinessObjects Experience iOS and desktop app while restricting them to the data to which they have access.  We worked on this demo and provided a very compelling experience for our client.  In this blog, I will share with you what we did to fulfill these requirements.

Database Considerations

First of all, we started with a data source.  Our data source contains eFashion data migrated from Microsoft Access to Microsoft SQL Server.  Of course, the database platform doesn’t matter.  Secondly, we added a table that contains a row-level mapping of data to individual users.  For instance, we had a restriction that required one user to see only two states: California and Colorado.  A second user should be able to see DC, Florida and New York.  A third user should be able to see all states.  A fourth user should have no access.  Here is the table that we created:

Manager          State
James.Mason      California
James.Mason      Colorado
James.Mason      DC
James.Mason      Florida
James.Mason      Illinois
James.Mason      Massachusetts
James.Mason      New York
James.Mason      Texas
Marshall.Kelley  DC
Marshall.Kelley  Florida
Marshall.Kelley  New York
Mel.Whatley      California
Mel.Whatley      Colorado

Next, we updated the standard eFashion universe to include this new security table and joined it to the State field of the existing Store table.  That’s it for the data source.  We can now link data to individual users at the database level.

SAP BusinessObjects Explorer

Now that the data source is created and the universe is built on top of it, we can create our first Information Space.  Create a new Information Space containing fields for the manager name and the state.  This association represents the data to which each user has access.  Finally, save and index the new Information Space.  This is the reference Information Space.

Next, build a second Information Space that contains all of the facets and measures to be used with Explorer and Experience.  Specify the correct properties, objects and scheduling (if needed).  Note that if you have implemented security at the database or universe level, Explorer effectively strips that security away when you index the Information Space.  Data is brought into Explorer from the data source according to the security of the user performing the index.  When users open the Information Space in Explorer, they see the data that the indexer had access to see.

The key step in enabling security is to use the reference Information Space in the final Information Space.  Explorer compares the current user name with the specified Manager field and performs the filtering on demand.  As users log into SAP BusinessObjects or the iPad app, the Manager field is filtered according to the user’s login ID.  If a user’s login ID does not exist in the Manager field, no data will be returned and the user will see a message stating this fact.

Personalization

To establish a personalized Information Space, click the Personalization tab of the final Information Space.  Click the check mark beside “Personalize Information Space Exploration”, then click the down arrow beside the Exploration View selector to the right.  Browse through the available data providers and select the reference Information Space.  There are two sections at the bottom of the screen: To Filter and Filtered by.  In the To Filter section, map the User Name field to the Manager field in the Filtered by section.  We are filtering states in this example, so select State in the Filtered by section on the same line as State in To Filter.  When you are done, the Personalization screen should map User Name to Manager and State to State.

Exploration Views

Now that we have two Information Spaces, we can build Exploration Views from the final Information Space.  The personalization happens on the Information Space and is extended to the Exploration Views that are built on it.  To create an Exploration View, open the Information Space.  Click Create View Set and design your Exploration View.  Click Save and provide a name.  Switch over to the Exploration View Sets section and click your new Exploration View.  You should now see your design personalized for the currently logged-in user.  Below are screenshots for four users.  The first three have access to certain states.  The fourth has no access to any data.

The user above has access to California and Colorado.

The user above has access to DC, Florida and New York.

The user above has access to all states.

This user has access to nothing.

Conclusion

SAP BusinessObjects Explorer is a unique visualization tool that allows users to interact with data in new and different ways.  Data security and Explorer are not mutually exclusive.  In this article, we discussed the standard capabilities of Explorer and Exploration Views.  Administrators can apply Personalization to restrict data for users who explore the personalized data sources.  Users who have access to the spaces and views but no access to the data will be restricted.  Users with partial or full access to the data will see their data represented accordingly.  Only users who have access to modify the underlying Information Spaces can control the Personalization settings within Explorer.

Using Dropbox for Versioning in Dashboard Development

I’ve been a dashboard developer for a few years now, and the biggest fear that I’ve had is building a complex dashboard and finding it corrupted when I try to reopen it.  It’s happened a few times in my career and it’s devastating.  Recovering a corrupt dashboard is nearly impossible unless you’re into poring over lines and lines of XML contained within the XLF file.  Even then, there are no guarantees that you will recover your hours of work.

I have found a solution that works very well, and that solution is Dropbox.  Dropbox is a cloud storage service that lets users save files within their Windows folder structure just as they do every day.  However, Dropbox monitors the directory for added or updated files and sends the changed files up to its secure cloud.  This is where the magic happens.  When an update is found, Dropbox keeps a backup of the previous version of the file and stores the change as a new version!  Here’s how it can work for you.

Go to the Dropbox (http://db.tt/OREYpVH) website and download the client.  It’s a very quick and easy installation, and the app has a small memory footprint.  During the installation, you can accept the default location (on a Windows machine it’s c:\users\<username>) or specify a different folder to serve as the root.  You can also selectively determine which folders will be synced.  Once you finish this setup, you are ready to go!

Once the installation is complete, you will have a folder in the location that you specified, and it will contain your synced files.  The folder displays a small check mark when it is fully synced to the cloud.  If it is being refreshed, it shows the double-arrow refresh icon.

Next, simply go about your dashboard development as usual.  In this example, I created a small dashboard.  The content of the dashboard is not important, as we are only concerned with the XLF file itself.  I saved this file in a new folder called Xcelsius 2008 within the Dropbox structure.

Open a Windows Explorer window and navigate to the saved dashboard.  Right-click the dashboard and select Dropbox in the context menu.  Click View Previous Versions.  A browser will open and take you directly to the location of the file stored in the cloud.  Since we have only saved the dashboard once, you will see an initial “Added” entry with the size and time that it was changed.

Go back to the dashboard and make a change.  Click Save to save the changes to the folder.  If you watch the folder icons, you’ll notice that Dropbox sees that there is a change and will upload the change to the cloud.  Now, navigate to the file again and right click to view the context menu then click View Previous Versions.  The browser refreshes with the current historical listing.

Make another change on the dashboard and click Save again.  Once Dropbox adds the instance to the cloud, right click the file name again and select View Previous Versions.  The third instance displays in the browser.

To roll back to the previous version, simply click the radio button that appears to the left of the desired instance and click the Restore button.  Open the XLF file inside of Xcelsius to see the previous version.  When you restore an instance, a Restored event is recorded in Dropbox, giving you the ability to restore to ANY of the previous instances.

Dropbox maintains versions of your files for 30 days, after which they are automatically removed.  Work that I performed a year ago is therefore final and cannot be rolled back.  Dropbox does offer an add-on called Pack Rat in its Pro version that lets you keep revisions beyond the 30-day window.

This method has saved me many times.  One time in particular, I was working on a complex dashboard for an international corporation.  I opened the dashboard one morning and began working as I typically do.  I saved the dashboard, closed Xcelsius and took a break.  When I came back and tried to open the dashboard, I received an error message saying that the dashboard could not be opened.  My heart sank.  I was able to use Dropbox to roll back to a version that I had saved 20 minutes prior, so I lost very little work.

In addition to Dropbox, I always maintain daily versions of my dashboards as extra insurance.  Before beginning work for the day, I make a copy of the XLF and append vXX to the end just before the XLF extension.  On the dashboard that I just described, I was up to v43.  Dropbox then takes over and provides an incremental revision history from the point of creation until the most recent save.
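From a command prompt, that daily snapshot is just a copy with the version number appended (the file name below is hypothetical):

      REM Keep working in the original XLF; the vXX copy is the day's safety net
      copy "Sales Dashboard.xlf" "Sales Dashboard v43.xlf"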

Give this method a try, and not just with XLF files but with any of your files.  I presented this idea at the SAPInsider Xcelsius Bootcamp last year and it seemed to be well received.  I hope you find it useful and a time saver.  And I hope you never have to suffer a corrupted dashboard!