
ASUG BI Licensing Webcast Notes Part 1


This was an ASUG webcast from the other week, with a focus on BI (not predictive analytics or HANA)

 

On a different webcast I became aware of this related document about licensing - see here

1fig.jpg

Figure 1: Source: SAP

 

Everyone's contract is different

 

2fig.jpg

 

Figure 2: Source: SAP

 

There have been multiple BI license models over time

 

3fig.jpg

Figure 3: Source: SAP

 

Figure 3 shows the context of BI license models; SAP has previously had add-on models

 

SAP has moved to suite-style licensing

 

Differences in the BI suite license are shown on the right, including Desktop and Lumira Server

4fig.jpg

Figure 4: Source: SAP

 

Figure 4 shows the core licensing principles

 

There is no obligation or requirement to convert licensing

 

SAP wants to be transparent in license models

 

License models are non-version specific

5fig.jpg

Figure 5: Source: SAP

 

SAP no longer sells CPU licenses to new customers, but existing customers can still purchase them

 

NUL stands for Named User License – for managers, power users, and most desktop tools

 

CSBL stands for Concurrent Session-Based License – for casual users who do not require guaranteed access

 

In the CMC, you configure users as NUL or Concurrent

6fig.jpg

Figure 6: Source: SAP

 

Figure 6 shows 1 logon = 1 session

7fig.jpg

Figure 7: Source: SAP

 

It still counts as one session when navigating between applications

8fig.jpg

Figure 8: Source: SAP

 

Figure 8 shows that SAP is moving away from CPU-based licenses because it wanted to remove hardware constraints

 

Part 2 is coming when time allows

 

Reference:

 

Upcoming ASUG BI Webcast Listing


New SAP Analytics Roadmap deck available


Dear SCN user,

 

We are happy to inform you about the availability of the updated SAP Analytics Roadmap slides.

 

 

The deck includes updated features and benefits of solutions released since the last roadmap, such as:

  • SAP BusinessObjects Business Intelligence platform 4.1, SP5
  • SAP BusinessObjects Mobile 6.1
  • SAP Lumira 1.25
  • SAP Lumira Server
  • SAP BusinessObjects Analysis, edition for Microsoft Office, version 2.0
  • SAP BusinessObjects Design Studio 1.5
  • SAP Predictive Analytics 2.0
  • Updated Planned Innovations for all Solutions

 

You can download the updated roadmaps via the links:

Overall Analytics Roadmap*

Analytics BW Roadmap*

Analytics Agnostic Roadmap*


* User Account required for SAP Support Page



Kind Regards

SAP GTM BI

4.1 SP5 Promotion Manager Constant Issues


The Promotion Manager tool does not bring over instances, and the UMT refuses to move content from one 4.x system to another. I am currently testing this on BI 4.1 SP5 Patch 5.

 

Can anyone suggest a better way to move content from 4.1 SP1 to our test box with 4.1 SP5 Patch 5?

We have over 100,000 reports and need to move several thousand for testing. Why? Because updates to BOE often fail with critical issues, so we can't just apply them to our system and hope everything works.

I read in a separate post that SAP will eventually create a thick client for customers who need to move larger amounts of content. I already tried the UMT on this SP5 and it refuses. Does anyone know if and when this new thick client might be coming?

 

CONSTANT ISSUES INCLUDE:

- Fails most of the time

- Some failures only say "Failure" and tell you to go look at the logs. How about a clue for us inside PM?

- One error on a connection said this: "Relationship would not be a tree after update so bailing". I guess bailing is a strategy for this poorly designed tool. It appears that you must bring over every universe that uses a connection before it will actually bring the connection. That's just plain wrong. I may not want those other universes overwriting previous work.

- Duplicate name. This and any other tool needs to allow us to overwrite ANY existing content if we so choose. Someone changed the CUID using Save-As and kept the same name. I need to replace that file -- why not let me? The only solution here is to delete that content and rerun the job. With users and groups, this is at best a large nightmare.

- No instances of scheduled reports come over. In fact, even the report itself won't come over if the destination report has instances. What kind of choice is that?

Most of our dashboards depend on scheduled reports, so what's the point in not bringing the instances with that content?

 

What else might help?

1. It would be EXCELLENT if SAP designed an Easy Button for mirroring content to another server. It would have to ensure nothing points back to the source system and create new cluster keys. We have tried this manually; it wasn't fun, and the result still has artifacts of the original system.

2. If they are working on a tool to move larger amounts of content, it would be SPLENDID if they also made a way to mirror the security across all folders without having to move all content and all users at the same time. We could move the groups in batches, then hit the Easy Button and it would magically assign the groups to the folders, universes, etc.

SAP HANA and SAP BusinessObjects BI 4.1 (SAP BI) compatibility


Problem statement: hard dependency between SAP HANA SPSs and BI 4.1 SPs


Currently, from the SAP BI 4.1 side, there is a one-to-one mapping between SAP BI 4.1 SPs and SAP HANA SPSs; i.e. SAP BI SP releases are not forward compatible with SAP HANA SPSs, as per the Product Availability Matrix (PAM).


Each time a customer upgrades SAP HANA to a newer SPS version, it mandates an SAP BI 4.1 upgrade to the supported SP as well. This constitutes a significant burden for our customers; sometimes it is a showstopper.


Proposed guideline and solution:

 

Teams internally did additional testing on the newer, previously unclaimed version combinations to make sure SAP BI + SAP HANA customers will not have to go through this problem in the future.


With this, there is a commitment from both the SAP HANA and SAP BI teams to maintain compatibility between them.

For example, all existing features on the BI side will continue to work with a new SAP HANA SPS version in this combination. Customers will get support from SAP's respective teams to resolve any issues with the latest SAP HANA version, if there are any, while they continue with the existing SAP BI SP version in their landscape.


The SAP BI PAM documents have been updated with this new proposal, i.e. all active SAP BI 4.1 SP lines will work with the latest SAP HANA SPS release. Customers need not update their SAP BI 4.1 landscape to consume the latest SAP HANA SPS version.

 

Following is the model in which we are looking at supporting this combination. Please note that SAP HANA SPS10 and NEXT are not released as of now, so please use this as a guideline only (refer to the PAM for actuals).

 

HANACompatibility1.PNG

 

Summary: In general, we would like to assure the customer that ALL ACTIVE SAP BI 4.1 SPs will connect to the latest SAP HANA SPSs. However, we advise you to continue using the PAM document as THE reference for support, to get the latest update on the versions supported and any workaround needed.


Enterprise BI Innovations Roadmap – askSAP Call – Part 3


This is part 3 from yesterday's webcast.  Part 1 is askSAP Analytics Innovations Community Call Notes Part 1 and Part 2 is askSAP Analytics Innovations Call Notes Part 2 SAP Lumira Roadmap

 

Please note the usual legal disclaimer applies: things in the future are subject to change.  What I particularly liked about this call was the time spent on questions & answers (see below).

1fig.jpg

Figure 1: Source: SAP

 

SAP said they value customers’ feedback

2fig.jpg

Figure 2: Source: SAP

 

Coming for Design Studio: an increased number of rows that universes can bring back (today it is 5K), mobile offline support, and more, as shown in Figure 2

3fig.jpg

Figure 3: Source: SAP

 

Figure 3 covers Analysis Office with a converged Excel client to include EPM, and a new formula editor for 2.1

 

4fig.jpg

Figure 4: Source: SAP

 

Figure 4 covers future plans (subject to change) for Analysis Office, with improved PowerPoint integration and publishing workbooks to the HANA platform

5fig.jpg

Figure 5: Source: SAP

 

Figure 5 covers plans for the future for Web Intelligence (past BI4.1 SP06)

 

Next release for Web Intelligence includes shared objects and annotations

6fig.jpg

Figure 6: Source: SAP

 

Figure 6 covers plans for Mobile BI; SAP is seeing increasing demand for Android

7fig.jpg

Figure 7: Source: SAP

 

Figure 7 shows plans for a faster installer

 

A report comparison tool to save time during upgrades

 

Linked universes – many projects require universes

 

“Biggest and best partner ecosystems” to extend BI Platform

 

Question & Answer

Q: Universe on BEx query – will it replace anything?

A: It makes BEx queries more business friendly for end users for consumption in Web Intelligence

 

Q: In which BI versions will the new Web Intelligence features be available?

A: SP06 – next week

Future plans – BI4.2 – late this year or early next year (forward-looking statement)

 

Q: Any future plans for a commenting solution for all BI tools?

A: Commenting for Web Intelligence is at the platform level – WebI is the first tool to use it; SAP is looking at other tools

 

Q: Is the performance of WebI on BICS universes similar to BEx queries?

A: no performance numbers to verify

 

Q: Lumira isn’t supported on Crystal Server? What do those customers do?

A: Technologically speaking can do this but now focused on Lumira server for teams – you should be able to connect to universes from Lumira teams on Crystal Server

 

Licensing – you can purchase Lumira Edge – team server & BI Platform

 

Q: When can we view Mobile Dashboards without going through the BI app?

A: Working on it, no timeframe yet

 

Q: is broadcasting of Design Studio reports available?

A: not available today

The ability to schedule using the BI platform is on the to-do list

 

Q: SAP’s UX strategy says it will converge on Fiori – how will this be reflected in the BI platform & client tools?

A: For the BI platform / clients, SAP is looking to integrate with Fiori

Lumira & Design Studio started this with a Fiori tile opening into a Lumira story – SAP is working on adding OpenDoc capabilities

There will be more adherence to the Fiori design language in further solutions, including Cloud for Planning

 

Q: What is the future for SAP Infinite Insight?

A: SAP brought together InfiniteInsight and SAP Predictive Analysis into SAP Predictive Analytics

 

SAP also announced SAP IT Operations Analytics - see an overview in this PC World article: SAP previews new analytics tools for IT, business users | PCWorld

 

Additionally ASUG has a webcast on this in August - Data Center Intelligence

 

ASUG also has a webcast in September titled "What is coming in BI4.2" - register here

 

Finally, if you have questions about moving from BEx tools to Analysis and Design Studio join ASUG today  - register here

State of the SAP BusinessObjects BI 4.1 Upgrade - June 2015


SAP BusinessObjects Business Intelligence Support Pack 6 is Here!

 

Today, SAP released SAP BusinessObjects Business Intelligence 4.1 Support Pack 6 to the SAP Support Portal, both as a full build and as a patch to previous versions. Support Pack 6 has something we haven't seen from a support pack in a long time: new features! Christian Ah-Soon, SAP Product Expert, has written a great summary here on the SAP Community Network (see related article, SAP BI 4.1 SP6 - What's New in Web Intelligence and Semantic Layer). Web Intelligence users will no doubt put document-level input controls to great use. There are small yet significant usability improvements. For example, Export Data functionality has been added to the Reading mode (previously, you had to remember to go to Design mode for that feature). There are improvements to Microsoft Excel data providers. And while I'm not a huge fan of Free-Hand SQL (see related article on my personal blog, Free-Hand SQL Isn't Free), I'm thankful that SAP has closed yet another Web Intelligence feature gap with Desktop Intelligence. And if you're a Live Office fan (don't be ashamed), you'll be glad to know that Live Office has not only been given UNX universe access in BI 4.1 SP6, but the product also has a road map and a future (see related SCN article SAP BusinessObjects BI 4.1 SP06 What's New by Merlijn Ekkel for a comprehensive overview of what's coming to the entire platform). I've barely scratched the surface here, so please read Christian's and Merlijn's much more detailed articles.

 

BI 4.1 SP6 is the last support pack to be released in 2015. Read that sentence again, I'll wait... Support for XI 3.1 and BI 4.0 ends on December 31, 2015, and it is unlikely that BI 4.2 will be generally available by that time (although it might be in ramp-up, cross your fingers). This means that BI 4.1 SP6 is going to be the go-to release of BI 4.1 for the foreseeable future. And with just a bit of nostalgia, the article that you're reading now will likely be the last "State of the BusinessObjects BI4 Upgrade" I'll write this year (check out the State of the BusinessObjects BI 4 Upgrade archive on the EV Technologies web site). Tomorrow morning, before the first cup of coffee is finished, I'll begin helping a customer download the 4.1 SP6 full build for their XI 3.1 migration kickoff. And I've already downloaded the SP6 patch to apply to one of our internal sandbox servers tonight.

 

You are no doubt wondering if BI 4.1 SP6 is a stable release. And I am, too. I'd be lying if I said that BI 4.1 and its first five support packs were completely pain free. Let's hope that the product quality is just as impressive as the new features.

 

SAP Lumira v1.25 for the BI Platform - Now with Free Sizing Guide!


The big deal at last month's SAP SAPPHIRE was the release of SAP Lumira v1.25, which brought the first iteration of integration with the BI 4.1 platform. I've been lucky to follow Lumira v1.25 from a special SAP Partner training program to its Customer Validation program and finally to its general availability. Release 1.25 brings SAP Lumira from the desktop to the BI 4.1 platform without the requirement for SAP HANA, a stumbling block for a significant number of BI platform customers. But until today, Lumira for the BI platform was missing a critical component: sizing guidelines. SAP has published an updated SAP Lumira Sizing Guide to the SAP Community Network that includes sizing for the BI 4.1 add-on. The add-on brings the same in-memory database engine to the BI 4.1 platform that SAP introduced to Lumira Desktop in version 1.23 a few weeks ago.


Time to Start Migrating!


The software and documentation released today, combined with the SAP Lumira v1.25 and Design Studio 1.5 software that was released last month (see related article, State of the SAP BusinessObjects BI 4.1 Upgrade - May 2015 (SAPPHIRE Edition)), bring all of the pieces together to take your BI landscape into the future. I hope that these pieces and their installation will be more tightly integrated in BI 4.2. But for me, as well as many of you, the adventure begins tomorrow. Just as soon as all of the software is downloaded.

 

More to come...

SAP BusinessObjects BI4.1 SP06 Released


Dear All,

 

We are pleased to announce that SAP BusinessObjects BI4.1 SP06 has been released and is available for download from http://support.sap.com

 

Additional resources on SAP BusinessObjects BI4.1 SP06:

 

 

* requires logon to SAP Support Page with a valid account

 

Regards

Merlijn

Data source support for Universe Design tool (UDT) and Multi Source Universe (MSU)


I have received multiple queries from various forums about how we are planning to support UDT and MSU connectivity in the future. I have compiled below our approach going forward, from both the BI 4.1 SP and the upcoming BI 4.2 release perspective.


Universe Designer Tool (UDT):

As you all know, UDT is used for creating new UNVs based on various supported data sources. Starting from SAP BI 4.x, we additionally ship the Information Design Tool (IDT) as part of the SAP BI product suite, which helps users create multi-dimensional universes, namely UNXs. The IDT and UNX combination is forward looking and has advanced features/enhancements.

 

While users can open UNVs from IDT and convert existing UNVs to the UNX format, they can continue to use UDT for creating new UNVs on the supported data sources.

 

However, going forward in BI 4.x releases:

  • We will continue to support UDT for DBs/sources which are supported in the BI 3.1 version, to make sure there is no regression in the upgrade scenario.
  • Newer versions of these DBs (if introduced by the vendor) will be tested and certified for UDT.
  • UDT will not be certified for the new data sources introduced in the BI 4.x releases.
  • The current status of UDT support is kept up-to-date in the Product Availability Matrix (PAM), under the UNV column.

 

Example:

A customer is using Oracle 10g as the database for a UNV created using UDT as part of BI 3.1/BI 4.0.

  • Future Oracle versions will continue to be supported in UDT (as part of BI 4.1 / BI 4.2), so that the customer can migrate seamlessly.
  • Data sources which are/will be new in BI 4.1 / BI 4.2 (like Hadoop Hive, Amazon Redshift, etc.) will not have UDT support.

 

Multi Source Universe (MSU):

Going forward, MSU will be tested and certified against the top 6 DBs/data sources only, including SAP HANA and SAP BW (Teradata 15, Oracle, MSSQL, Progress, SAP HANA, SAP BW).

For other data sources, MSU support will only be considered based on a business case or customer request. We will add the support for a justified request, through FPs or SPs, based on priority.

The current status of MSU support for various data sources is kept up-to-date in the Product Availability Matrix (PAM).


Building a customer-focused BI application on SAP BusinessObjects - Part 1


Most BI landscapes in industry follow a content-driven BI approach rather than a user-focused one. While the content-centric approach works well for the IT or IS organization, it poses challenges for the business, which has to juggle a lot of content (dashboards, reports, Explorer information spaces and other BI content) to complete an analysis. This leads to frustration and confusion, and wastes a lot of the business's time gathering all the relevant information for a specific analysis. When a new business user wants to perform the same analysis, the path he takes can be time-consuming: he first has to understand which BI content is available, what type of information each piece holds, and then switch between those contents to reach an answer.



To overcome this problem we came up with a novel way to build user-focused BI using custom websites with embedded BI content. Before going there, you might ask why anyone would need one more website when we already have BI Launchpad in BusinessObjects as the default portal. The answer is quite simple: BI Launchpad can hold multiple types of content, such as reports (Webi/Crystal), dashboards and Explorer information spaces, but most of the time they just sit in different folders and subfolders, there is no logical way to tie them to a specific type of activity or user, and the process can be very cumbersome. Also, the contents are sometimes not linked together; for example, there could be a sales dashboard and a sales detail report, but the user has to go to the sales dashboard, find the scenario he wants to analyze, then go to the report and select all the prompts and filters to get to the details for that scenario.



How this solution works from a bird’s eye view: the most critical features that make the solution work are OpenDocument URLs for specific BI content and single sign-on for BusinessObjects. The solution leverages the OpenDoc links of BusinessObjects content, combining them with iframes in a custom portal. The portal is rendered via an IIS website which has a user-friendly DNS alias. Say a user can access all the relevant sales information by typing http://sales vs. http://businessobjects-dev (followed by a bunch of clicks to get to the desired folder); which one makes more sense and is easier to remember when you are looking for all BI content related to sales? We created the sites and named them like http://sales: short, meaningful and easy for users to remember. The IIS websites make use of iframes within which the OpenDocument links for dashboards, Explorer information spaces and Webi reports are called. We also make sure the website code loads the dashboard content in parallel while the page loads, without wasting the user's time, and once loaded the dashboards are not refreshed automatically.
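As a minimal sketch of the mechanism described above: each portal tab is essentially an iframe whose src is an OpenDocument URL. The server name, port and CUID below are placeholder values, not those of the original implementation:

```html
<!-- Hypothetical "Sales" tab: embeds a dashboard via its OpenDocument link.
     Replace boserver:8080 and the iDocID value with your own server and CUID. -->
<div id="sales-tab">
  <iframe width="100%" height="800" frameborder="0"
          src="http://boserver:8080/BOE/OpenDocument/opendoc/openDocument.jsp?sIDType=CUID&amp;iDocID=Aa1Bb2Cc3Dd4Ee5">
  </iframe>
</div>
```

With single sign-on enabled, the iframe renders the dashboard directly, so the user never sees the BI Launchpad folder structure.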

 

Let's take an example:

Let’s take a fictitious scenario: assume you are a product manager in a large organization selling products to consumers across the globe, and you are assigned to a product line in the company. Your job requires you to ensure you have enough inventory for next week for your top-selling products of last quarter in the North American region, and to ensure the plants which supply the product are going to produce enough of them for the next quarter.

 

In a traditional content-driven BI scenario you would have to go to the sales folder and find out which report or dashboard gives you the top customers for last quarter by region. Then find your top-selling product for North America by filtering on your product lines and regions. After you find the product, you would need to go to the inventory folder and find out which report or dashboard shows the current inventory by product, then look up the current inventory level for the top product you got from the sales report. Then go to the forecast report, find the forecast of the product for the next quarter, and compare that number with the current inventory to understand how much of the product you would need to produce during the next quarter. This whole process can take many hours to get an answer.

 

Now let’s take the same scenario in this new approach, where there is a dedicated web link like http://PM-Analytics which has the sales dashboard, the inventory dashboard and the forecasting report at the same web link as different tabs. The user just goes to the sales tab and finds the top-selling product, then moves to the next tab, inventory, while still preserving his sales analysis. There he finds the inventory numbers, then goes to the next tab, the forecast report, filters on the product and compares the additional inventory that will be needed based on the forecast. Sounds simple! This process also saves the user a lot of headache in finding the right content and using it correctly, as everything needed is in one place; his sales analysis is not lost, and he can do a similar analysis for the South America region quite easily, as his old analysis does not automatically reset to defaults and the session is still active. This process should take no more than a few minutes.

 

 

How does it Look:

In traditional content-focused BI, users have to go to the Launch Pad, open the Public Folder and then find all the content needed for an analysis.

Bi Launchpad 2.png

 

In the new process, users just need to type a URL in a browser, which can be as simple as http://sales. This allows the user to directly view the landing dashboard, without the hassle of finding it in a folder, along with all the additional BI content that supports the analysis. They do not see anything else except what they need.



Geo1.jpg

Inventory.jpg


The application can have reports which support the analysis, and also Explorer information spaces for data exploration.

When users want another set of related data, they just click on another tab, which takes them to the additional analysis.


report.png

 

I am going to write another post to discuss all the technical details and possibly share some source code so you can achieve this in your own environment.

A Hadoop data lab project on Raspberry Pi - Part 4/4


Carsten Mönning and Waldemar Schiller


Part 1 - Single node Hadoop on Raspberry Pi 2 Model B (~120 mins), http://bit.ly/1dqm8yO

Part 2 - Hive on Hadoop (~40 mins), http://bit.ly/1Biq7Ta

Part 3 - Hive access with SAP Lumira (~30mins), http://bit.ly/1cbPz68
Part 4 - A Hadoop cluster on Raspberry Pi 2 Model B(s) (~45mins)


Part 4 - A Hadoop cluster on Raspberry Pi 2 Model B(s) (~45mins)


In Parts 1-3 of this blog series, we worked our way towards a single node Hadoop and Hive implementation on a Raspberry Pi 2 Model B showcasing a simple word count processing example with the help of HiveQL on the Hive command line and via a standard SQL layer over Hive/Hadoop in the form of the Apache Hive connector of the SAP Lumira desktop trial edition. The single node Hadoop/Hive setup represented just another SAP Lumira data source allowing us to observe the actual SAP Lumira-Hive server interaction in the background.


This final part of the series will go full-circle by showing how to move from the single node to a multi-node Raspberry Pi Hadoop setup. We will restrict ourselves to introducing a second node only, the principle naturally extending to three or more nodes.

HiveServices5.jpg


Master node configuration


"Node1" will be set up as the master node, with "node2" representing a single slave node 'only'. To keep things nice and easy, we will 'hard-code' the node IP settings in the local hosts files instead of setting up a proper DNS service. That is, open the hosts file, for example with the leafpad text editor (sudo leafpad /etc/hosts), and modify the master node hosts file as follows:


     192.168.0.110     node1

     192.168.0.111     node2


Remember in this context that we edited the /etc/network/interfaces text file of node1 in Part 1 in such a way that the local ethernet settings for eth0 were set to the static IP address 192.168.0.110. Thus, the master node IP address in the hosts file above needs to reflect this specific IP address setting.
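For reference, the corresponding eth0 block in /etc/network/interfaces from Part 1 would look roughly as follows; the netmask and gateway values are assumptions for a typical 192.168.0.x home network and may differ in your setup:

```
# /etc/network/interfaces on node1 (static address from Part 1)
auto eth0
iface eth0 inet static
    address 192.168.0.110
    netmask 255.255.255.0
    gateway 192.168.0.1
```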


Similarly, edit the file /opt/hadoop/etc/hadoop/masters to indicate which host will be operating as master node (here: node1) by simply adding a single line consisting of the entry node1. Note that in the case of older Hadoop versions, you need to set up the masters file in /opt/hadoop/conf.


You may remember from Part 1 of the series that the Hadoop configuration files are not held globally, i.e. each node in an Hadoop cluster holds its own set of configuration files which need to be kept in sync by the administrator using, for example, rsync. In Part 1, we configured the Hadoop system for operation in pseudodistributed mode. This time round we need to modify the relevant configuration files for operation in truly distributed mode by referring to the master node determined in the hosts file above (here: node1).

 

core-site.xml

Common configuration settings for Hadoop Core.


hdfs-site.xml

Configuration settings for HDFS daemons: the namenode, the secondary namenode and the datanodes.


mapred-site.xml

General configuration settings for MapReduce daemons. Since we are running MapReduce using YARN, the MapReduce jobtracker and tasktrackers are replaced with a single resource manager running on the namenode.

 

File: core-site.xml - Change the host name from localhost to node1

  <configuration>

    <property>

      <name>hadoop.tmp.dir</name>

      <value>/hdfs/tmp</value>

    </property>

    <property>

      <name>fs.default.name</name>

      <value>hdfs://node1:54310</value>

    </property>

  </configuration>

Configuration_2.png
File: hdfs-site.xml - Update the replication factor from 1 to 2

    

  <configuration>

     <property>

          <name>dfs.replication</name>

          <value>2</value>

     </property>

  </configuration>

 

File: mapred-site.xml.template ("mapred-site.xml" if dealing with older Hadoop versions) - Change the host name from localhost to node1
  <configuration>

     <property>

          <name>mapred.job.tracker</name>

          <value>node1:54311</value>

     </property>

  </configuration>

Configuration.png

Assuming that you worked your way through Parts 1-3 with the specific Raspberry Pi device that you are now turning into the master node, you need to delete its HDFS storage, i.e.: sudo rm -rf /hdfs/tmp/*


This already completes the master node configuration.

 

Slave node configuration


When planning to set up a proper Hadoop cluster consisting of considerably more than two Raspberry Pis, you may want to use an SD card cloning programme such as Win32 Disk Imager download | SourceForge.net to copy the node1 configuration above onto the future slave nodes. For each of these clones, modify the interfaces and hosts files, as described above, by replacing the node1 entries with the corresponding clone host name.


Alternatively, and assuming that the Java environment, i.e. both the Java run-time environment and the JAVA_HOME environment variable, is already set up on the relevant node as described in Part 1, use rsync to distribute the node1 configuration to the other nodes in your local Hadoop network. More specifically, run the following command on the master node to push the configuration to the slave node (here: node2):


     sudo rsync -avxP /opt/hadoop/ hduser@node2:/opt/hadoop/


This way the files in the hadoop directory of the master node are distributed automatically to the hadoop folder of the slave node. When dealing with a two-node setup as described here, however, you may simply want to work your way through Part 1 for node2. Having already done so in the case of node1, you are likely to find this pretty easy-going.


Modify the file /opt/hadoop/etc/hadoop/slaves as follows to configure the nodes for slave services by simply adding the list of host IDs, for example (here: two-node Hadoop cluster setup):


     node1
     node2


The public SSH key generated in Part 1 of this blog series and stored in id_rsa.pub (and then appended to the list of SSH authorised keys in the file authorized_keys) on the master node needs to be shared with all slave nodes to allow for seamless, password-less communication between master and slaves. Therefore, add ~/.ssh/id_rsa.pub from the master node, node1, to ~/.ssh/authorized_keys on the slave node, node2. Switch to the hduser on the master node via su hduser and run ssh node2. You should now have password-less access to the slave node.
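The key distribution step can be scripted from node1; the hostnames and the hduser account are the ones used throughout this series, and ssh-copy-id is assumed to be available (it usually is on Raspbian):

```
# Run on the master node (node1) as hduser.
# Option 1: let ssh-copy-id append the key and fix permissions:
ssh-copy-id -i ~/.ssh/id_rsa.pub hduser@node2

# Option 2: append the key manually:
cat ~/.ssh/id_rsa.pub | ssh hduser@node2 'mkdir -p ~/.ssh && cat >> ~/.ssh/authorized_keys'

# Verify password-less access:
ssh hduser@node2 hostname
```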


Cluster launch


Format the Hadoop file system and launch both the file system, i.e. namenode, datanodes and secondary namenode, and the YARN resource manager services on node1, i.e.:


     hadoop namenode -format


     /opt/hadoop/sbin/start-dfs.sh

     /opt/hadoop/sbin/start-yarn.sh


When dealing with an older Hadoop version using the original map reduce service, the start services to be used read /opt/hadoop/bin/start-dfs.sh and /opt/hadoop/bin/start-mapred.sh, respectively.


To verify that the Hadoop cluster daemons are running ok, launch the jps command on the master node. You should be presented with a list of services: the namenode, secondary namenode and a datanode on the master node, and a datanode on each slave node.


Also, as described in Part 1, you may want to check the setup at http://node1:50030. Similarly, http://node1:50070 will provide you with details on your HDFS. If you find yourself in need of issue diagnostics at any point, consult the log4j.log file in the /logs subdirectory of the Hadoop installation directory first. If preferred, you can separate the log files from the Hadoop installation directory by setting a new log directory in HADOOP_LOG_DIR and adding it to the hadoop-env.sh script.
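For example, relocating the logs is a single line in hadoop-env.sh; the /var/log/hadoop path below is just an illustrative choice, not one mandated by the series:

```shell
# Add to /opt/hadoop/etc/hadoop/hadoop-env.sh:
# send Hadoop daemon logs to a directory outside the installation tree
export HADOOP_LOG_DIR=/var/log/hadoop
```

Remember that this file, like the rest of the Hadoop configuration, is per-node and needs to be kept in sync across the cluster.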


And this is really pretty much all there is to it. We hope that this four-part blog series helped to take some of the mystery out of the Hadoop world for you and that this lab project demonstrated how easily and cheaply an, admittedly simple, "Big Data" setup can be implemented on truly commodity hardware such as Raspberry Pis. We shall have a look at combining this setup with the world of Data Virtualization and, possibly, Open Data in the not-too-distant future.

 

Links

A Hadoop data lab project on Raspberry Pi - Part 1/4 - http://bit.ly/1dqm8yO
A Hadoop data lab project on Raspberry Pi - Part 2/4 - http://bit.ly/1Biq7Ta

A Hadoop data lab project on Raspberry Pi - Part 3/4 - http://bit.ly/1cbPz68

Jonas Widriksson blog - http://www.widriksson.com/raspberry-pi-hadoop-cluster/

 


Most basic SAP BI 4.1 SP4 full build installation on Linux


Hi Folks,

 

 

This post is for reference for anyone who wants to install the BI 4.1 SP4 full build on Linux and has never seen a similar installation.

The purpose of this post is to walk through a very simple, basic standalone installation: no add-ons, no custom database, no clustering, etc.

 

Prerequisites:

A working RHEL host, with its own entries added to the hosts file.

Required packages preinstalled according to the installation guide.

Check the PAM (supported platforms document) before installation.

 

Steps:

Quick check: create a directory (here, bi42sp4p1) to be used as the installation directory.

(Installation wizard screenshots 1.png through 25.png)

 

More Reference:

SAP Business Intelligence Platform Pattern Books - Business Intelligence (BusinessObjects) - SCN Wiki

 

 

That's all folks.

 

 

Regards,

Onkar

Unlock the Auditing database with a new Universe and Web Intelligence Documents for BI4.1


An understanding of how your BI Platform is used and utilised will enable you, as a BI Platform Administrator, to take the necessary steps to improve its reliability, performance and adoption within your organisation.

 

The Auditing database coupled with a new comprehensive Universe and a set of Web Intelligence documents that I have developed will help give you that insight you need and this is what I'd like to share with you now.

 

My Universe and documents have been in development, on and off, for some time but they have now reached a maturity level where I’m happy to share them with a wider community.

 

I’m overall pretty happy with the Universe and the documents; however, they need a little performance testing on large data sets. This is where you can help me, help you!

 

Please download my latest ‘build’ (available for a limited time) and give them a blast. They are provided ‘as is’. I’m looking for feedback on any defects, performance issues and also additional reporting/business requirements. If you can get back to me with your feedback, I can improve the content for everyone else to benefit. I may occasionally publish a newer ‘build’ in the same container, so check every now and then for an update.

 

Once I’m happy with the amount of feedback and testing I will make the Universe and documents more widely and permanently available.

 

I have ported the universe to various databases and it is currently available for:

  • SAP HANA
  • Microsoft SQL Server
  • Oracle
  • SQL Anywhere

Feedback on which database I should next port to would be helpful too!

 

There’s a large set of documents, each with a number of ‘reports’. The number of reports ranges from 1 to over 50 within a single document. So you can see I’ve been busy! They will take you some time to go through them all.

 

Here’s a list of documents:

1.     STA1 - Start here - Events over time.wid

2.     FRA1 - Fraud Detection - 1 machine more than 1 user.wid

3.     FRA2 - Fraud Detection - 1 machine more with multiple logon failures.wid

4.     LIC1 - License - 1 user more than 1 machine.wid

5.     LIC2 - License - Periods when sessions exceeded X.wid

6.     LIC3 - License - Users no longer using the system.wid

7.     SYS1 - System - Event Log.wid

8.     SYS2 - System - Delay in Recording of events to Audit Database.wid

9.     SYS3 x1 - System - Overall System Load Analysis (without Mode).wid

          SYS3 x2 mi - System - Overall System Load Analysis (Mode is Interactive Only).wid

          SYS3 x2 ms - System - Overall System Load Analysis (Mode is Scheduled Only).wid

          SYS3 x4 - System - Overall System Load Analysis inc Mode.wid

10.     SYS4 x1 - System -  Refresh Analysis (Mode is Interactive).wid

          SYS4 x1 - System - Refresh Analysis (Mode is Scheduled).wid

          SYS4 x2 - System - Refresh Analysis (inc Mode).wid

11.     USA1 x1 - Usage - Session Analysis.wid

          USA1 x15 u - Usage - Session Analysis (With Users, Without Mode).wid

          USA1 x2 mI - Usage - Session Analysis (Mode is Interactive Only).wid

          USA1 x2 mS - Usage - Session Analysis (Mode is Scheduled Only).wid

          USA1 x30 umI - Usage - Session Analysis (With Users) (Mode is Interactive Only).wid

          USA1 x30 umS - Usage - Session Analysis (With Users) (Mode is Scheduled Only).wid

          USA1 x4 m - Usage - Session Analysis (With Mode).wid

12.     USA2 - Usage - Large number of Data Providers.wid

13.     USA3 - Usage - Documents no longer used in the system.wid

14.     USA4 - Usage - Universe Objects usage. Identify infrequent used objects.wid

15.     USA5 - Usage - Universes no longer used.wid

 

Each document has an ‘About’ page that provides a few more details on its purpose.

The Universe is, of course, documented within itself. Every description box has a description! However I’ve not yet written supporting documentation for either the universe or the Web Intelligence documents. Feedback from you on what I should explain would be great!

 

Requirements: BI Platform 4.1 Support Pack 5 or greater.

 

Instructions

  1. Download the content.
  2. Import one of the four 'Universe' LCMBIAR files into your system using Promotion Management (it will go into "BI Platform Auditing" folder)
  3. Import the Web Intelligence LCMBIAR file (it will go into "BI Platform Auditing" folder)
  4. Edit the connection that is imported (in "BI Platform Auditing" folder) with the correct login credentials.
  5. Open the Web Intelligence document ‘STA1 - Start here - Events over time.wid’ as your starting point!

 

Please post your feedback here and I will do my best to comment back as soon as possible. (I’m on annual leave 24th July until 17th August 2015 so I won’t be able to reply during this time)

 

Matthew Shaw

@MattShaw_on_BI

Fingerprints for SFTP Destinations in SAP BusinessObjects BI Platform 4.1


SFTP Destination support is one of the more interesting new features introduced with the recently released SAP BusinessObjects BI Platform 4.1 Support Pack 6.

 

Quite a lot of customer requests for this one, and it's finally here!

 

When you send or schedule a document to a SFTP destination, you will be asked to enter a fingerprint value.

 

  • What is a fingerprint?
  • Why is it important?
  • How do you determine the fingerprint?

 

I'll answer these questions in this blog. Additionally, I'll describe how I set up a simple environment that I've used for internal testing and teaching purposes for the SFTP feature.

 

SSH File Transfer Protocol (SFTP) Fingerprint

 

SFTP uses Secure Shell (SSH) to send files securely over the network. It's a full-fledged transfer and file management system that uses public-private key cryptography to ensure any client may send a file to a server securely.


Sometimes it's confused with FTP Secure (FTPS) or Simple FTP, but they're not compatible. FTPS is FTP over SSL and Simple FTP has no security features built in.

 

Why the need for secure file transfer?

 

I'll give the most often cited analogy: snail mail. Say your company needs to send a letter to a bank. You put it in an envelope, address the envelope, and drop it off at your company's mailroom. The clerk hands it over to the postman for delivery to the bank.

 

But let's say the clerk happens to be not-above-board. He steams open the envelope and reads the contents, and uses the information found within for private gain. Your letter is compromised. The clerk puts the letter back in the envelope, seals it, and sends it on its way, no-one the wiser.

 

To prevent that, the bank mails you special envelopes. Anyone can put contents into the envelope, but only the bank can open the envelope without destroying the contents. The shady clerk's now thwarted and would no longer be able to read the contents and steal the information.

 

But say the clerk's pretty crafty. He knows that the bank envelopes are delivered through his mailroom. So he waylays the package when it comes in. Instead, he has a set of those special envelopes made for himself, that only he can open, and forwards those envelopes to you. You can't tell the difference between the clerk's envelopes and the bank's, and so you put the letter in the clerk's envelope and drop it off at the mailroom. The clerk opens the envelope, reads the letter, steals the information, then puts the letter in one of the bank's envelopes, and gives it to the postman. Neither you nor the bank is aware that the letter has been compromised.

 

The clerk is called the man-in-the-middle, and the scheme he plays is called the man-in-the-middle attack.

 

To thwart a man-in-the-middle, what the bank will do is place a very unique symbol on its envelopes. This symbol would be extremely difficult for others to duplicate. They then publicly publish what this symbol looks like, allowing you to verify that the special envelopes you have are actually from the bank and not from the man-in-the-middle.

 

This symbol is a fingerprint.

 

Fingerprints are extremely difficult to duplicate, since they're computed by hashing the public key, the key used for cryptography.

 

Discover the SFTP Fingerprint that BI Platform Expects

 

Now that you know the importance of a fingerprint, how do you discover the fingerprint needed, when sending/scheduling a document to SFTP?

 

If you use an SFTP client tool such as WinSCP or PuTTY, you'll see that they present a fingerprint value for every SFTP server that you connect to. But those fingerprint values won't work with BI Platform, because the hashing algorithm used is different.

 

Typical client tools use an MD5 hash. BI Platform uses the more secure SHA-1 hash. Because of that, you'll need some other means to get the fingerprint.

 

One way is to let BI Platform tell you. When it connects to a SFTP server, it retrieves the public key and computes the SHA-1 fingerprint from it. If that expected fingerprint does not match the fingerprint you've entered for the SFTP destination parameters, then an error is entered in the trace files. That error line records both the expected and entered fingerprint values. You can use this to get the expected fingerprint. The steps are described in SAP Note 2183131, but I'll describe the steps here as well.

 

Log onto the Central Management Console and enable tracing for the Adaptive Job Server. Log onto BI launch pad, navigate to the public "Web Intelligence Samples" folder, right-click on a WebI document and select from the menu Send->SFTP Location:

 

BI_launchpad_SFTP.JPG

 

Fill out the SFTP Server information, including hostname, port, user name and password. For the fingerprint, just enter a keyword that'll be easy to remember and search for, say FINDTHEFINGERHERE:

 

BI_launchpad_findthefinger.JPG

 

Click Send.  Nothing appears to happen (not even an error dialog box pops up), but the document would not have been sent to the SFTP server.

 

Go to the machine where the BI Platform Adaptive Job Server is running, and navigate to the logging folder for the BI Platform deployment. Find the trace file associated with the Adaptive Job Server Destination Service child process. Open the glf file associated with that Service, and search for the fingerprint keyword you entered above:

 

trace.JPG

 

Here's the line:

 

destination_sftp: exception caught while connecting to sftp server [<hostname>]. Details: [83:89:8c:dd:e8:00:a2:e3:26:63:83:24:47:71:ec:8c:1b:ce:de:25 is admin input.Mis match in fingerprint. i.e hashing server fingerPrint obtained  from serverFINDTHEFINGERHERE]

 

The long sequence of 20 two-digit hex numbers separated by colons is the SHA-1  hash of the public key as received by BI Platform. Enter that value into the FingerPrint box of the Send dialog box:

 

BI_launchpad_finger.JPG

 

and you'll see the document be sent successfully to the SFTP server.
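Rather than eyeballing the trace file, you can also pull the fingerprint out mechanically. The sketch below is hedged: sample.glf and its contents are stand-ins for your real Destination Service trace file, and it assumes a grep that supports -E and -o (GNU or BSD grep both do).

```shell
# Create a stand-in trace file containing a line like the one shown above
printf '%s\n' 'Details: [83:89:8c:dd:e8:00:a2:e3:26:63:83:24:47:71:ec:8c:1b:ce:de:25 is admin input.]' > sample.glf

# Extract any SHA-1-style fingerprint: 20 colon-separated two-digit hex numbers
grep -ohE '([0-9a-f]{2}:){19}[0-9a-f]{2}' sample.glf
```

Point the grep at your actual glf file instead of sample.glf; the -h flag suppresses filenames if you search several trace files at once.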

 

Are we done?

 

What if I were to ask you whether the fingerprint above is the one for the SFTP server or a man-in-the-middle between your BI Platform deployment and the SFTP server?

 

You can't tell by looking at the fingerprint value itself, you need some other independent way to validate it. A good way is to contact the SFTP server maintainer, and ask them "Would you provide us, securely, the SHA-1 fingerprint for your SFTP server?" That's actually the best way.

 

But sometimes you encounter Administrators who don't know how to do that. What then?

 

Given the public key, a public key you've gotten from the SFTP server by secure means, you can compute the fingerprint yourself. I'll give instructions to do that.

 

First, let's set up a simple trial SFTP server, so we can see things from the SFTP server side.

 

 

Generating the Cryptographic Public Key and Private Key

 

First, generate the public and private keys that the SFTP server will use for cryptography. There are various ways to do this; some SFTP server products have their own.

 

What I'll use is the popular and common PuTTY tools.

 

Download the PuTTYgen RSA key generation utility from here.

 

It's a fairly easy tool to use. In the "Parameters" section, specify the type and length of key, and click the "Generate" button:

 

PuTTYgen.JPG

 

You'll see that the public key in "OpenSSH format" will be displayed in the text area titled "Public key for pasting into OpenSSH authorized_keys file:". Copy and paste the key into a text editor, such as Notepad or Notepad++, and save the contents to a file named public_key_openssh.pub. By the way, you see the "Key fingerprint:" value in the above screenshot. Ignore it. That's an MD5 hash fingerprint, not the SHA-1 fingerprint we want.

 

Next, go to the menu selection Conversions -> "Export OpenSSH key" to export the private key to a file, which I name private_key.key

 

PuTTYgen_export_key.JPG

 

Why an OpenSSH key? It's because I'm going to use a SFTP implementation that expects private keys to be in OpenSSH format. There are other formats, and you'd need to refer to your SFTP server documentation to find out which one, if you're going to use something different from me.

 

Now that we have the keys, let's set up the SFTP server.

 

 

Setting up the freeFTPd SFTP Server

 

For simplicity, I'll use the open-source freeFTPd implementation of the SFTP server. There are others, but freeFTPd is the one I find is easiest to set up and use.

 

Download and run. First go to the SFTP -> Hostkey page, and specify the private_key.key RSA key you generated previously:

 

freeFTPd_host_key.JPG

 

Then go to the Users page and create a test user. Call it testuser:

 

freeFTPd_users.JPG

 

Now go to the SFTP page and start up the SFTP server, making sure you first set where the SFTP is to store the incoming file in "SFTP root directory" setting:

 

freeFTPd_start.JPG

 

And finally, check the Status page to ensure the SFTP server is running:

 

freeFTPd_status.JPG

 

That's it!

 

Connect to this SFTP server using the instructions given above, and get the fingerprint value that BI Platform expects. What we want to do now is compute the fingerprint directly from the public key file public_key_openssh.pub and verify that the value is correct.

 

 

Use OpenSSL tools to Compute the SHA-1 Fingerprint

 

Let's have a look at the public key file contents (in OpenSSH format):

 

ssh-rsa AAAAB3NzaC1yc2EAAAABJQAAAIEAnx3a1iYFDX4HY8Ysf2hOE1UJwha+rLD0iq82gn3+Lgla3ZzPOTuU4R39yQ5cgtzfvQrUq+NIEVEKrw1Vm3CuYVs/UrCUEhDhYOc4AfzszDGaLPnIIJjrZt9i2TnZ+9OeLakno4bgNntVglr8GbL2tryg+FWTzPGcq9O6O5gnavE= rsa-key-20150626

 

Now the first field, 'ssh-rsa', specifies that the type of key is RSA, and the trailing 'rsa-key-20150626' is merely an optional comment (I just had PuTTY denote the type and date when I generated it).

 

In between, the seeming gibberish is the Base64-encoded string of the public key's binary value. What we need to do is extract this value from the file, Base64-decode it to get the binary value back, then generate the SHA-1 digest of this value (in colon-separated two-digit hex format).

 

The last step can be done using the OpenSSL command-line tools, and the other two pre-steps can be handled with standard text-processing command-line tools.

 

The easiest approach, if you're not on a Unix machine, is to install the Cygwin toolset. The Cygwin command-line tools include the text-file manipulation and base64 utilities needed to automate the other steps. Go to the Cygwin site and install the tools (the default install won't include the OpenSSL package, so make sure you manually select it as well during the installation of Cygwin packages).

 

Now, the way to compute the fingerprint is a single (albeit longish) command-line:

 

openssl_sha1_fingerprint.JPG

 

Breaking down the individual commands on the pipe, the command:

 

    cut -d ' ' -f 2 < public_key_openssh.pub

 

reads the file public_key_openssh.pub, cuts the contents at whitespace, and streams out the second component. Essentially, it's extracting the Base64 encoded public key from the public key file. The command:

 

    base64 -d

 

merely reads the input pipe, Base64-decodes it, and streams out the binary value. And finally, the command:

 

    openssl dgst -c -sha1

 

uses the OpenSSL tool to compute the SHA-1 Digest from the binary value.
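You can try the whole pipeline end to end without a BI deployment by fingerprinting a throwaway key. This is a sketch under stated assumptions: ssh-keygen, a GNU-style base64 (with -d), and openssl are on the PATH, and testkey is just a scratch filename.

```shell
# Generate a throwaway RSA key pair in OpenSSH format, no passphrase
rm -f testkey testkey.pub
ssh-keygen -q -t rsa -b 2048 -N '' -C 'rsa-key-test' -f testkey

# Field 2 of the .pub file is the Base64-encoded public key blob;
# decode it and take the colon-separated SHA-1 digest, exactly as above
cut -d ' ' -f 2 < testkey.pub | base64 -d | openssl dgst -c -sha1
```

The output is a line ending in 20 colon-separated hex pairs, the same shape as the fingerprint BI Platform records in its trace.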

 

As you can see, the fingerprint we compute directly from the public key corresponds to the one BI Platform says it got from the SFTP server. So the public key BI Platform is using is the one from the SFTP server, and not one from a man-in-the-middle.

 

Summary

 

If you require ways to send or schedule BI Platform documents across the network securely, the recommended solution is to upgrade your deployment to BI 4.1 SP6 or higher, and use the new SFTP destination functionality.

 

One quirk is the fingerprint value. This blog describes how to determine the fingerprint value to use, and how to validate the fingerprint for correctness.

 

Hope you find this information useful, and you're able to integrate this new functionality into your BI architecture!

 

 

Ted Ueda has supported SAP BusinessObjects BI Platform and its predecessors for almost 10 years. He still finds fun stuff to play with.

New features coming soon in the SAP BI Platform Support Tool 2.0


logoo.png

Over the past six months we have been hard at work designing and developing a brand new supportability platform for SAP BusinessObjects BI Platform. This product is the SAP BI Platform Support Tool version 2.0, an evolution of and follow-up to the original BI Platform Support Tool version 1. During 2014 we collected feedback from engineers, developers, and customers and implemented as much of it as possible in the new platform. I believe that with the new version we will significantly reduce the number of incidents needed, reduce the amount of work for the BI administrator, and considerably shorten the time it takes to resolve support incidents raised to SAP support.

 

In this article, I will share the release schedule and provide details on all of the confirmed features coming soon in the version 2.0 release.


    

BI Support Dashboard


The home view now in version 2.0 is a BI supportability dashboard that brings together all of the resources that a BI administrator needs to support their BI Platform environments.  It is essentially a browser that displays useful SCN RSS feeds, hyperlinks to important maintenance, patching, and documentation, as well as a knowledge base search feature that searches KB Articles, SAP Notes, BOB Forum, and Google Search all simultaneously from a single query.  Once you have logged on using your S-USER account, the S-USER SSO certificate is stored within the BI Platform Support Tool client providing you a quick way to access important support content.


dash1.png

dash22.png

 

    

Landscape Analysis Report


One of the primary features of the product is the Landscape Analysis Report.  The Landscape Analysis Report is the name given to a collection of one or more analysis reports containing information about the BI Platform landscape.  The user can select which types of analysis should be included in the Landscape Analysis Report depending on the type of information that is needed for a particular service or root cause analysis task.

 

LA.png

 

The criteria for including data in the Landscape Analysis Report are:

 

  • Data for an analysis type can be collected in less than 10 minutes
  • Information is useful to be reviewed on a re-occurring basis
  • Data can be collected without introducing a large performance hit on the target system
  • Change Analysis and Alerting can be applied to the collected data

 

The generation of a Landscape Analysis Report occurs in two separate phases: an Extraction phase and a Report Generation phase. This separation means that historical report instances can be opened and saved data can be analyzed or compared separately from the actual data extraction. As a result, offline analysis is possible by SAP support or other consulting organizations that cannot connect directly to the live customer system.

 

CVOM Charting

 

We have implemented the same charting engine as used by other SAP BusinessObjects Analytics products such as Lumira and Design Studio.  The CVOM charting engine allows us to visualize more of the system metrics and properties making analysis quicker and more intuitive.


chaar.png

 

New Analysis Types

 

In version 2.0, we have both added new types of analysis and improved the functionality of the existing analysis that existed in version 1.x.  Refer to the table below for a list of the analysis types and information about that analysis.


 

  • Server and Services (data source: Coarsegrain) - Information about BI server configuration, settings, and metrics. The configuration is also displayed in a side-by-side comparison report for quickly spotting differences in server settings or command line properties
  • Content (data source: InfoStore) - Displays information regarding the count of Info Objects in the system. This is useful for understanding which products are in use and how large the InfoStore repository is
  • Schedule (data source: InfoStore) - Pulls back scheduled instances and analyzes why reports are failing, which instances are taking up the most disk space, the most common error messages for failed instances, and the longest running instances. You can also now add a date filter to view only the instances you need to analyze
  • License Key (data source: InfoStore) - Analyzes the current keycodes in use and gives alerts if a keycode will expire soon or if there is missing functionality
  • Platform Search (data source: InfoStore) - Confirms that best practices are being followed concerning the Platform Search feature. This is a common cause of performance degradation if not optimally configured
  • Hardware Summary (data source: SAP Host Agent) - Invokes the SAP Host Agent and returns information about the host and operating system for each node in the BI landscape
  • Authentication (data source: InfoStore) - Displays information about the third party authentication setup and single sign on
  • Semantic (data source: InfoStore) - Shows which Universes and Connections are being used the most. Displays how many reports will be affected by changes to these semantic layer objects (UNX, UNV). Checks for orphaned Webi documents (those without a linked universe)
  • Web Application Server (data source: JMX) - Connects to the Java application server and shows information and metrics about the Java Virtual Machine as well as the application server settings and configuration
  • Patch History (data source: SAP Host Agent) - Collects the installation and patch history from each BI node. This is useful to see which patches have been applied, in what order, who installed each patch, and whether it was an install, uninstall, or repair

 

   

New Custom Alerting Framework


One limitation of the previous version is that all the alerts and thresholds were static and configured at development time. In version 2.0, we have built a new extensible alert framework that allows the expansion and customization of the metrics and settings that are evaluated. Additionally, the threshold values and logic used to trigger the alerts can be customized to better align with the needs of a particular organization or environment.


The alerts themselves are evaluated during the extraction phase so that the triggered alerts are stored within landscape xml itself.  This way, if you are reporting on the landscape xml outside of the SAP BI Platform Support Tool or if you are sending the landscape xml to SAP for analysis, the alerts that were triggered at extraction time will always be able to be recalled and viewed in a historical manner.


There is some new alert terminology to be aware of in the 2.0 platform.


Simple Alerts - Allow user customization, changes to thresholds, delete or add new metrics.  Simple alerts are limited to evaluation of one metric/setting and one logic operator


System Alerts - These are system defined alerts which allow for more advanced logic and analysis.  System alerts include things such as keycodes expiring in the next 30, 60, or 90 days, nodes not at the same install patch level, or nodes not running the same support pack


Complex Alerts - Alerts which allow you to combine the results of two or more alerts, using AND and OR logic to determine the alert state. Complex Alerts are not available yet in version 2.0 and are scheduled to be implemented in the next version (version 2.1)


Alert Definitions - Alerts are configured via the preferences UI and are stored in the file alerts.xml under the BI Platform Support Tool /resources/Alerting directory

alertsss.png

       

   
Alert Summary Tab

 

Any simple or system alerts triggered in the Landscape Analysis will appear on the Alert Summary Tab.  This makes it possible to quickly review which alerts were triggered in a particular analysis so that actions may be taken where necessary.  The Alert Summary tab also contains information such as the user who ran the Landscape Analysis, how long the processing took, and what version of the SAP BI Platform Support Tool was used to generate the analysis.


Aller.png

   

  

Improved E2E Trace Wizard

 

One of the most common activities required for root cause analysis of a BI landscape is to generate an End to End Trace.  Using the included SAP Client Plug-in, each request sent from the BI client contains an SAP Passport which is intercepted by the application server and passed along to the backend processing servers and databases.  This feature automates this process by automatically configuring the BO_trace.ini on each BI node, recording a video capture of the trace session, and collecting the log files from each host in the landscape for the user.

 

e2e.png

 

The E2E Trace Wizard relies on the existing Landscape Definition to understand which BI nodes and Application Server nodes are defined for the target landscape.  For each BI node and Application Server node, the user has a choice of whether they want to use the SAP Host Agent or UNC Shared Directories to collect log files from the remote hosts.  This allows flexibility for cases where the customer cannot run a SAP Host Agent or share network folders on a particular node type.  After the logs are collected, the content is zipped up and stored in the E2E working directory of the BI Platform Support Tool user directory.  The user may then forward this required tracing information and video capture to SAP for quick problem resolution or code correction.  Once the trace session is complete, the E2E Trace Wizard will revert the BO_trace.ini back to default settings on each node in the landscape.

 

    

Change Analysis 2.0


A useful technique in troubleshooting the SAP BI Platform is to understand what changes have been made in the BI system as some changes may lead to performance problems and/or system crashes.  The Change Analysis feature builds on the existing Landscape Analysis Report feature and allows the end user to select two or more Landscape Analysis Report Instances for comparison.

 

Changh1.png

 

When compared, each landscape XML is loaded into the memory of the BI Platform Support Tool. Each property name/value pair is compared using a comparator, and when differences are found they are displayed in the client in columnar format. Values determined to be different are highlighted in yellow for quick and easy identification. Although useful for identifying changes to the system, the Change Analysis tool can also be used to view a change in performance metrics over a period of time (for example, Memory Usage each Saturday).
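The comparison idea can be illustrated with a trivial sketch. This is not the tool's actual implementation; the snapshot file names and the property names and values below are made up for illustration.

```shell
# Two illustrative weekly snapshots of server properties
printf 'MemoryUsageMB=2048\nThreadCount=40\n' > saturday_week1.properties
printf 'MemoryUsageMB=3072\nThreadCount=40\n' > saturday_week2.properties

# Show only the properties whose values changed between the two captures
# (diff exits non-zero when the files differ, hence the || true)
diff saturday_week1.properties saturday_week2.properties || true
```

Only the changed MemoryUsageMB line shows up in the diff output; unchanged properties such as ThreadCount are suppressed, which is the same effect as the tool's yellow highlighting.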


Changeanalysis.png



a1.png

3rd Party Authentication Wizards

 

The procedure for setting up third party authentication and single sign on (SSO) tends to generate a lot of incidents and can be a fairly complex set of procedures.  This process requires the administrator to read the manual and follow instructions very closely for success.  Additionally, differences in environments can make understanding the setup guide a bit difficult since it is not tailored for their particular landscape.  This is where the Authentication Wizard comes in.  This wizard guides the BI administrator through every step of the process while customizing the setup depending on their own domain, LDAP, or SAP environment.  Furthermore, it even authors emails for the BI administrator to send to their domain administrators with instructions on the steps that need to be taken on the domain controller, BW system, LDAP server, etc.  This wizard is truly like having SAP Support helping you without ever needing to create an incident.


 

  

     

Landscape Toolbox


The new Landscape Tools section contains a number of Diagnostic Tools which are mostly used by SAP Support for specific troubleshooting tasks. This area is mainly for smaller applications that do not fit the criteria required for the Landscape Analysis Report.  Applications in this area are usually tools that have existed in support in the form of JSP pages or smaller Java console applications.  Refer to the table below for details on the included tools:


LTool.png



Publish Landscape to SAP (Reverse 911 Alerting)

   

thumbnail.jpg

Predictive Maintenance is a big initiative here at SAP Active Global Support and to help facilitate a more pro-active support service, we have built into version 2.0 the ability to safely and securely publish your Landscape Analysis Report directly to SAP.  If you choose to participate, your landscape XML is consumed on an internal SAP HANA system where a variety of analytics can automatically check for problems such as:

 

  • Landscapes not following best practice or not within PAM recommendations
  • Systems where tracing was accidentally not disabled
  • Systems that may be vulnerable to a new security vulnerability that was discovered
  • Systems that may contain a setting that was recently discovered to introduce a problem
  • Landscapes running a patch or SP that may contain a regression

 

The goal of this functionality is to identify problems before they occur and pro-actively reach out via email to those customers and administrators who may be affected by the problem or situation.

 

r911-2.png

 

 

Release Dates and Beta

 

The SAP BI Platform Support Tool 2.0 will be released for free as an official product on the SAP Store.  Since it is an official product, we are subject to SAP Product Standards, and as a result the release has taken a bit longer than originally expected.  Prior to release to customers, we will hold a beta release.  We are in the final stages of the release process now and plan to release the beta in August 2015.  Release to customers will follow in the September/October time frame after the beta program wraps up.

SAP BusinessObjects BI Platform Product Roadmap Notes


This was an SAP webcast given by Maheshwar Singh, SAP, this month.  Below are my notes.  Note that things in the future are subject to change.

1fig.png

Figure 1: Source: SAP

 

Figure 1 shows the legal disclaimer that things in the future are subject to change.

2fig.png

Figure 2: Source: SAP

 

Figure 2 covers self-service BI on the BI Platform as defined by SAP – this was not covered in the BI Platform roadmap.  BI Launchpad includes save, share, and manage content, plus self-service on the BI Platform

3fig.png

Figure 3: Source: SAP

 

Schedules are subject to change

 

Purpose of roadmap is to give you short term and long term goals

 

Figure 3 shows the roadmap in three sections – today, planned innovations in 12-18 months, future innovation – forward looking

 

Today is BI4.1 SP6

4fig.png

Figure 4: Source: SAP

 

Integration kits are no longer a separate add-on; key capabilities are included

 

LCM console is integrated in CMC as promotion and version management

 

Planned Innovations

 

Planned innovations are in the 12-18 month range, and anything could change based on priorities

5fig.png

Figure 5: Source: SAP

 

As shown in Figure 5, plans include add-on audit improvements to cover all clients – Lumira, Analysis Office, Design Studio.  For audit samples, see Matt Shaw's post Unlock the Auditing database with a new Universe and Web Intelligence Documents for BI4.1

 

The admin broadcasting messages idea came from Idea Place.  For example, the admin can send a message to all users, some users, or specific users – say, for downtime that IT plans.  Messages can also be sent by e-mail if e-mail addresses are available for users, and are also available as alerts

 

Recycle bin is a top-voted idea in Idea Place.  The plan is to make it work like the Windows recycle bin – in the CMC, the admin can restore content, with the first focus on public folders.  The admin sets a default clean-up date, which can be changed.  In the first iteration it is not available for universe connections and personal documents

Admin cockpit is one central page to get information about what is happening with your system

6fig.png

Figure 6: Source: SAP

 

Migration improvements: the Upgrade Management Tool UI gains an option to change the log level instead of editing the INI file

 

Promotion management – selective retrieval from LCMBIAR – top voted idea

 

Performance improvements around promotion management

 

Manage Inbox – under consideration – manage users’ personal space; today you cannot restrict what a user stores in their personal space

 

BI Commentary is planned – today, collaboration is handled via SAP Jam.  Plans include:

  • Context to your comment
  • Available as part of the platform so it is centrally stored
  • Plan is to start with Web Intelligence and then Design Studio / Analysis Office
  • Plan is to see the same content across all reporting tools

7fig.png

Figure 7: Source: SAP

 

Today the install takes up a lot of time; SAP wants to look at reducing install and start-up screen time for patches

 

Simplify language update support experience

8fig.png

Figure 8: Source: SAP

 

Multitenancy was introduced in 4.1 with more improvements shown above

 

Promote content between tenant 1 and tenant 2

9fig.png

Figure 9: Source: SAP

 

Publications are planned for Analysis Office

 

Looking at scheduling to a printer as an idea

10fig.png

Figure 10: Source: SAP

 

New RESTful web services capabilities will be supported in BI4.2

 

Future Direction

11fig.png

Figure 11: Source: SAP

 

Quick time to value with deployment and automation of software maintenance

 

For TCO reduction, SAP notes that the CMC has several applications and is looking at consolidating them and making BI administration simpler

 

Roadmaps are updated twice a year

 

Related:

Upcoming ASUG Webcasts:

What's new in BI4.1 SP06

What is new in BI4.2

Unprecedented Visibility: Bringing BI Auditing and Monitoring to your Mobile Device

 

Upcoming ASUG Business Intelligence Community W... | ASUG


Migration from BOXIR3 to BI4.1SP6


I want to share some of my findings from this 'mission possible' – the journey from good old Deski to brave new WebI.

 

The customer's users are still fond of Deski and will miss the tool.

 

"Can we set the timeout to 8 hours?" a user asks – since in Deski you could keep the tool up and running the whole time, while WebI in the browser has a session timeout set!

We are testing a 240-minute timeout at the moment.
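For WebI viewed through the browser, the session length is typically governed by the servlet session timeout of the deployed BOE web application. As an illustrative sketch only – the exact file to edit depends on your web application server and BI version – the standard servlet descriptor setting for a 240-minute timeout looks like this:

```xml
<!-- Session timeout in minutes; 240 = 4 hours.
     The location of web.xml varies by app server and BI version. -->
<session-config>
    <session-timeout>240</session-timeout>
</session-config>
```

Keep in mind that very long timeouts hold server sessions (and licenses) open longer.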

 

The Report Conversion Tool (RCT) is still a bit – ahem – let's phrase it this way: it could be better.

 

It seems crucial to run it as administrator, which makes it tricky on a Windows 2012 server...

It still does not remember more than one set of login details – I'll have to check whether there is an INI file or something similar.

Some RCT installations will not run properly – don't ask why!

 

Some features already work, but the RCT still reports an error: e.g. 'Do not Update SQL' gets converted pretty well, but the conversion log reports an error/warning: 'unsupported option'.

 

Still struggling with free-hand SQL (FHSQL), though – Oracle works; investigation is ongoing into how to get Teradata FHSQL converted!

 

to be continued

 

Wobi

Securing your BI deployment


I hope everyone is having a nice, relaxing summer.  The Vancouver summer so far has been full of great weather, and I've been enjoying every moment of it.

A question I am seeing frequently from our customers is "how do I secure my BI deployment?" - and for good reason.  The headlines in my RSS reader are still filled with security breaches and data protection incidents, and I don't anticipate that going away any time soon.

 

My colleague Greg Wcislo has written a three-part series on answering this question.

 

Part 1: securing Identity Provider communication, and a review of how the data is stored

Part 2: configuring the web tier – likely the most critical part if your installation is exposed to the outside world

Part 3: securing the BI servers, including ports, firewalls and database encryption

As a bonus, here's another excellent write up by Greg on Encryption and Data Security in BI 4.0.

 

I strongly recommend implementing HTTPS and CORBA SSL in your deployment, along with enforcing regular password expiration for your users, using complex passwords and regularly reviewing authorizations in your BI system, even if the web application is not public-facing.

In addition, don't forget about SAP's security note portal.  It's located here:

https://support.sap.com/securitynotes

 

Other links of interest:

A BOBI Document Dashboard with Raspberry Pi


Carsten Mönning and Waldemar Schiller


In this blog post, we present a straightforward way of setting up an automatically refreshing BusinessObjects Business Intelligence document dashboard with the help of a conventional Raspberry Pi 2 Model B unit. You may take this as inspiration for a lab project along the lines of "A Hadoop Data Lab Project on Raspberry Pi", http://bit.ly/1dqm8yO. However, the setup is robust and simple enough to maintain that you can use it, for example, to operate a war-room terminal showing an automatically refreshing Web Intelligence key figure report within a BusinessObjects Business Intelligence production environment.

 

The basic idea is to force a database refresh of a BusinessObjects Business Intelligence document referenced via a standard SAP OpenDocument URL, which is reloaded automatically with the help of an auto-reload addon for the Debian "Iceweasel" web browser. The OpenDocument URL serves as the browser's landing page. For this to be more than a meaningless exercise, it is assumed that the data source for the Business Intelligence document is updated at least as frequently as the web browser's landing page is reloaded. We have been using this setup with the SAP CRM embedded Business Warehouse out-of-the-box "real-time" info providers configured to update at 10-minute intervals, thereby ending up with a report dashboard of 10-minute accuracy using standard SAP technology (and a Raspberry Pi). With the browser and its landing page set to launch automatically upon Raspberry Pi boot-up, this setup can be turned into something like a 'plug-and-play' solution for straightforward BusinessObjects document dashboard implementations.

 

We are assuming a basic knowledge of Linux commands. The installation and configuration process should take no more than 45 minutes in total.

 

Preliminaries

The following Raspberry Pi 2 Model B bits and pieces are required to get things off the ground:

  • A Raspberry Pi 2 Model B (quad-core CPU, 1 GB RAM).
  • An 8 GB microSD card with the NOOBS ("New Out-of-the-Box Software") installer/boot loader pre-installed.
  • A wireless LAN USB card.
  • A micro USB power supply, heat sinks and an HDMI display cable.
  • Optional, but recommended: a case to hold the Raspberry circuit board.

 

Rather than purchasing all of these items individually, you may want to go for a Raspberry Pi accessory bundle at approximately € 60-70.

RaspberryPiBundle.png

Setup overview

The installation and configuration process consists of the following three main steps:

 

  1. Raspberry Pi software configuration
  2. Web browser installation and configuration (auto reload plugin and OpenDocument URL landing page)
  3. Autostart and display configuration

 

Raspberry Pi software configuration

Launch your Raspberry Pi device. On the command line, enter sudo raspi-config to start the standard Raspberry Pi software configuration programme and make the following selections:

 

  1. Enable booting into the Raspberry desktop environment.
  2. Overclock the device to the "Pi 2" setting, i.e. "1000 Mhz ARM, 500 Mhz core, 500 Mhz SDRAM, 2 overvolt".

 

BootOption.png

Overclock.png

With the help of setting (1), we will be able to configure the device in such a way that it launches a web browser immediately following the completion of its boot up sequence, whilst setting (2) simply makes full use of the remarkably powerful processing capabilities of the Raspberry Pi 2 Model B.


Web browser installation and configuration

Establish a LAN or wireless internet connection for your Raspberry device and download and install the "Iceweasel" web browser, the Debian distribution's fork from the Mozilla Firefox browser, https://wiki.debian.org/Iceweasel (and not to be confused with the GNU browser "IceCat", formerly known as "IceWeasel"):

 

     sudo apt-get install iceweasel


Following a successful "Iceweasel" installation, install an auto reload plugin for the browser, for example "Reload Every" at http://reloadevery.mozdev.org.


With both the web browser and the auto reload addon in place, it remains to set the browser landing page to the SAP BusinessObjects Business Intelligence (BI) document you want to display and have database-refreshed automatically at regular intervals within the browser. This is where the SAP OpenDocument URL functionality comes in handy, https://help.sap.com/businessobject/product_guides/sbo41/en/sbo41_opendocument_en.pdf.


OpenDocument comes with your BusinessObjects BI platform installation in the form of a deployed web application. (The web bundle is part of the BOE.war file.) It processes incoming URL requests for BusinessObjects BI documents in the BusinessObjects Central Management Server and delivers the requested document to the end user. The supported document types include, among others, Web Intelligence documents, Analysis workspaces, Dashboard objects and Crystal reports.


The OpenDocument default URL syntax reads as follows:


     http://<servername>:<port>/BOE/OpenDocument/opendoc/<platformSpecific>?<parameter1>&<parameter2>&...&<parameterN>


where <platformSpecific> is to be replaced with openDocument.jsp for a Java deployment or with openDocument.aspx for a .NET SAP BusinessObjects BI deployment. Note that there must not be any spaces around the ampersands joining the parameters.


Refer to the SAP help document referenced above for the various OpenDocument parameters available. For our purpose of automatic database refreshing, the sRefresh parameter is the parameter of choice and is used, for example, as follows:

 

     http://<servername>:<port>/BOE/OpenDocument/opendoc/openDocument.jsp?iDocID=Aa6GrrM79cRAmaOSMGoadKI&sIDType=CUID&sRefresh=Y

 

In other words, the BusinessObjects document with the CUID, i.e., cluster unique ID, Aa6GrrM79cRAmaOSMGoadKI will undergo a database refresh each time it is opened via this OpenDocument URL.
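To avoid hand-assembling query strings (and accidentally introducing spaces around the ampersands), the OpenDocument URL can also be built programmatically. This is only an illustrative sketch – the helper name is made up, and the host, port and CUID are placeholders:

```python
from urllib.parse import urlencode

def opendoc_url(server, port, doc_cuid, refresh=True,
                platform="openDocument.jsp"):
    """Assemble an OpenDocument URL (hypothetical helper for illustration).

    platform: openDocument.jsp for Java deployments,
              openDocument.aspx for .NET deployments.
    """
    params = {"iDocID": doc_cuid, "sIDType": "CUID"}
    if refresh:
        params["sRefresh"] = "Y"  # force a database refresh on open
    # urlencode joins the parameters with '&' and no surrounding spaces
    return "http://%s:%s/BOE/OpenDocument/opendoc/%s?%s" % (
        server, port, platform, urlencode(params))

print(opendoc_url("bi.example.com", 8080, "Aa6GrrM79cRAmaOSMGoadKI"))
```

urlencode joins the parameters with ampersands and no whitespace, matching the syntax rule described above.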

 

Set the browser landing page to whatever document of your BusinessObjects BI deployment you would like to refer to using the above OpenDocument syntax. (This assumes, of course, that your Raspberry Pi device is granted the necessary network access privileges to resolve the OpenDocument URL.) You may want to finish this configuration step by setting the browser to full screen mode by pressing the F11 key.


Autostart and display configuration

Launch a Raspberry Pi terminal session and navigate to the autostart folder via


     cd ~/.config/autostart


Create a new file iceweasel.desktop with the contents shown below.
Autostart.png

That is:

     [Desktop Entry]

     Type=Application

     Name=iceweasel

     Exec=iceweasel

     StartupNotify=false

     Terminal=false

     Hidden=false       


Finally, prevent your display from ending up in power-save mode by adding the line xserver-command=X -s 0 -dpms to the file /etc/lightdm/lightdm.conf.
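For reference, the line goes under the seat section of the LightDM configuration – the section name may differ between LightDM versions, so check your existing lightdm.conf:

```ini
# /etc/lightdm/lightdm.conf (section name may vary by LightDM version)
[SeatDefaults]
# Disable screen blanking and DPMS power management for the X server
xserver-command=X -s 0 -dpms
```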

MonitorPowerSave.png

And that's pretty much it. Plug your Raspberry Pi into the dashboard display of choice and restart the device; the boot-up process should result in your BusinessObjects document automatically being shown and refreshed within the "Iceweasel" web browser. You may set the refresh frequency, as required, via the web browser's reload tab.


Links

A Hadoop Data Lab Project on Raspberry Pi - http://bit.ly/1dqm8yO

Cooking up an Office Dashboard Pi - https://gocardless.com/blog/raspberry-pi-metric-dashboards

Iceweasel - https://wiki.debian.org/Iceweasel

"Reload Every" auto reload addon - http://reloadevery.mozdev.org

Viewing Documents Using OpenDocument - https://help.sap.com/businessobject/product_guides/sbo41/en/sbo41_opendocument_en.pdf


Migration from Crystal Reports to BO in a few steps


Are you looking for visualization to strengthen your decision making?

 

Are you looking for predictive analytics to build strong economic growth for your organization?

 

Are you concerned about the budget needed to convert Crystal Reports to BO?

 

Are the timeline, duration and resource costs of migrating Crystal Reports to BO bothering you?

 

Brief Summary

Reporting is something every industry needs today: analytical reports to make decisions on the fly, drill-down reports to get the details of cost factors, comparison reports to know your competitors, and Google Maps with embedded graphs to view sales geographically.




         B2.png                                                                                                                                        B1.png


Most important of all, reports must be available on the web, mobile devices and tablets.

Today is the world of business intelligence, where every industry wants to increase sales, reduce costs and improve operations.  Accessing heaps of information in less time and presenting it intelligently with visual modelling is what a BI tool does.

There are many BI tools on the market; one of the most successful is BusinessObjects – an object which does great magic with visualizations beyond imagination.

BusinessObjects is the company, now owned by SAP.

BusinessObjects is a next-generation tool which provides many intelligent ways of presenting information on the web, tablets and mobile phones, at your fingertips.

BI is not just limited to Crystal Reports; it has been extended using BO with powerful and intelligent analytics.

 

 

 

Almost all industries use Crystal Reports for reporting, but Crystal Reports has very limited analytics for presenting data and is limited to desktop users.

There are many Crystal Reports customers worldwide; some have 100-1000 reports already developed.  Developing a crystal report with all its formulae and rules is very exhaustive work, and redoing it for visualization and extensibility with another tool is a pain in the neck.

Crystal Reports to BO conversion is another project implementation: redeveloping in BO all the reports already built in Crystal is like building another Rome – but, as I said, we can make things simple.

 

Learn about Crystal Reports to BO migration in my next follow-up blog.

 

 

 

B3.png

Free Hands-On Upgrade Course for BI4 (in Germany)


If you are a German-speaking customer planning to upgrade from BOE XIr2 or XI3 to BI 4.1, then you could hugely benefit from a *free* two-day hands-on workshop.

 

RIG Analytics can help steer you through the key planning considerations and upgrade tasks required for this kind of migration project.

This course will be held in Walldorf, Germany, on 1st and 2nd September 2015. Registration is simple.

 

Customer feedback from the 20 previous workshops is overwhelmingly positive! Examples from our recent United Kingdom, Spain and South Africa workshops include:

  • "Upgrading is a lot less intimidating now"
  • "Amazing how much knowledge was shared in 2 days"
  • "The best session I ever received from SAP"

For more information please contact your local SAP representative.
