
SAP BusinessObjects Business Intelligence Suite 4.1 SP05 (SP5) Released


 

 

Announcement

 

On Monday, November 17th 2014, as planned in the Maintenance Schedule, SAP released Support Package 05 for the following products:


  • SBOP BI Platform 4.1 SP05 Server
  • SBOP BI Platform 4.1 SP05 Client Tools
  • SBOP BI Platform 4.1 SP05 Live Office
  • SBOP BI Platform 4.1 SP05 Crystal Reports for Enterprise
  • SBOP BI Platform 4.1 SP05 Integration for SharePoint
  • SBOP BI Platform 4.1 SP05 .NET SDK Runtime
  • SAP BusinessObjects Dashboards 4.1 SP05
  • SAP BusinessObjects Explorer 4.1 SP05
  • SAP Crystal Server 2013 SP05
  • SAP Crystal Reports 2013 SP05

 

This comes five months after the release of SAP BI 4.1 SP4 (SP04) back in June 2014.

 

You can download these updates from the SAP Marketplace as a Full Install Package or Support Package (Update).

 

E.g.: Full Install

[Screenshot: new.png]

 

E.g.: Support Package (Update)

[Screenshot: update.png]

 

 

What's New?

 

The updated What's New document has been released early this time, on 06/11/2014.  It is a good read; there are a few good new features in this update, but the ones that are significant to me are:

 

  • SAP Lumira integration with the BI Platform
  • Free-hand SQL for Web Intelligence via the SDK and UI Extension Points*

 

* I was told at SAP TechEd that a sample will be made available to help us do this.  This new feature is expected to be available out of the box, with no SDK required, as of SP06.  I'll post a link to the sample here when I find it.

 

 

There are tons of fixes (356 to be exact).

 

 

Supported Platform (Product Availability Matrix)

 

The SAP BusinessObjects BI 4.1 Supported Platforms (PAM) document has not been released yet (the last update is still from September 19th, 2014).

 

Alternative URL: Product Availability Matrix | SAP Support Portal

 

 

Documentation

 

The usual documents have been made available:

 

 

 

 

 

 

 

 

 

* These SAP Notes are not yet released.  Use the associated URL in the meantime.

 

 

Forward Fit Plan

 

SAP is no longer updating the SBOP Forward Fit Plan, so I'm unable to confirm for the moment which updates are included here...  One would hope it's as it used to be and that it will include SAP BI 4.1 SP04 Patch 3...


To be confirmed...



Maintenance Schedule


SAP BI 4.1 SP05 Patch 5.1 (Week 51 - December 2014)

SAP BI 4.1 SP05 Patch 5.2 (Week 4 - January 2015)

 

SAP BI 4.1 SP06 is scheduled to be released late July 2015 (Week 30 2015).

 

Source: Schedules for Support Packages and Stacks | SAP Support Portal

 

 

Installing Updates


This training server has a clean installation of SP04 with only the English language installed.  This is how long it took to install everything.


Note: For those who have read my previous post about the release of SAP BI 4.1 SP04, the timings below will seem much quicker.  The reason for this is that I'm using a training server with the same specs but different pre-installed SAP software...  Or SAP has made things a lot quicker now!


 

Updates

 

    • SBOP BI Platform 4.1 SP05 Server
    • SBOP BI Platform 4.1 SP05 Client Tools
    • SAP BusinessObjects Explorer 4.1 SP05

 

Environment

 

    • Windows Server 2012
    • 4x Virtual Processors (Hyper-V)
    • 20 GB RAM

 

Duration

 

1. As always, the Preparing to install screen takes a while...  about 6-7 minutes for me.

 

[Screenshot: Please wait.png]


2. This chart shows the time spent waiting for the Preparing screen to disappear, followed by the installation time.


[Screenshot: install time.png]


3. As always, when you click Finish, do NOT reboot straight away.  Wait for setupengine.exe to go away in your Task Manager.  This can take a minute or so.

 

[Screenshot: Task Manager.png]

 

 

Past Articles

 

For information, I wrote the following articles about previous SAP BI Support Packages:

 

 

 

Conclusion

 

It's still early days and there are a couple of documents that need to be updated, but I'm looking forward to having a look at the Free-hand SQL in Web Intelligence!

 

As always, please share how it went for you in the comments below.  I'm sure this helps many people.

 

 

Please like and rate this article if you find it useful!

 

 

Thanks and good luck!

 

Patrick


Understanding the "Health State" in the BI Platform Monitoring application and repairing failed Health State Watches


One of the key concepts to understand in the BI Platform Monitoring application is the Health State metric.  A number of different aspects of the monitoring application rely on the Health State metric, and without a clear understanding of how it is supposed to work, effectively monitoring and troubleshooting the application becomes a frustrating task.  In this article, I will describe the concept of Health State in great detail, and I will also show how to correct the Health State in your BI Platform Monitoring application.

 

 

Health State Metric

 

In the Central Management Console, when creating a new watch, you will see that there are two types of Health State metrics.


  • Server Health State - The Server Health State indicates the health of a particular server.  This metric can be used to understand whether the server is up and running, whether the server is overloaded, and whether the server is still able to take additional requests.  The Health State of the server indicates to the BI administrator that they need to take action to troubleshoot a problem on that particular server.

 

[Screenshot: _bb.png]



  • Topology Health State - The Topology Health State indicates the cumulative health of all servers of a particular type (category health) and also of all servers in a particular server group.  The Service Categories include Crystal Reports, Analysis Services, Dashboard Services, Promotion Management Services, Core Services, Explorer Services, Connectivity Services, and SIA nodes.


[Screenshot: _aa.png]

 

 

 

How the value for Health State is determined

 

In the case of the Server Health State metric, the value is determined by the result of that particular server's watch.  Anytime you create a new server manually or use the System Configuration Wizard to create your Adaptive Processing Server configuration, the system will automatically create a new watch for each server using the nomenclature NODENAME.SERVERNAME Watch.  This is a "system" created watch and cannot be manually deleted.  You may have noticed in the Central Management Console that the system-created Server Watches are also displayed for ease of access under CMC -> Home -> Servers -> Servers List.

 

[Screenshot: _serverslist.png]

 

Health State Evaluation


Depending on the value returned by the server's watch formula, the server health will display one of the following five states.


 

  • GREEN - Server health is good and no action is necessary
  • AMBER - Server is slightly overloaded, nearing peak values as defined by the caution rule
  • RED - Server resources are overused, the server is unable to take new requests, or the server is stopped or disabled
  • DISABLED - The watch is marked as disabled in the BI Monitoring application.  Select the watch and click the enable button to re-enable the evaluation of this watch
  • FAILED - There is an error in the watch formula, or the BI Monitoring service is disabled or not running


Topology and Categories Health States

 

In order to provide the BI administrator a quick path to troubleshooting issues in the BI Platform landscape, the server health states are aggregated into service category health states.  This makes it much simpler to tell whether any particular product type is available for the end users of the system.  For example, if your BI system mainly processes Crystal Reports view-on-demand requests, then it is vital, in order to achieve maximum up-time, that all the Crystal Reports Processing Servers in the BI landscape are available to process these jobs.  The Crystal Reports category health state depends on the aggregated health state of all the Crystal Reports server watches.  This can be seen by editing the Crystal Reports category watch formula, where you will find the health state of all Crystal Reports servers in the formula.

 

[Screenshot: _watt.png]

 

In the case of the Crystal Reports category, all of the servers required to process Crystal Reports are grouped together in the topology map so that you can tell at a glance which server watch may be causing the overall category state to change.

 

[Screenshot: _crr.png]

 

Fixing the Overall Health Watch and the Health State Hierarchy

 

On the BI Platform Monitoring Dashboard, there is an Overall Health state indicator (also known as the Consolidated Health Watch).  You may have noticed that it quite often does not show a valid state (Green, Amber, or Red) and instead gives a state of Failed.  In order to fix this, it is important to understand how this particular Health State is determined, and then make the necessary corrections to the underlying watch formulas it depends on.  In the monitoring application, there is a large hierarchy of Health State watches, and if any of these dependent watches is broken or invalid, the Overall Health will show a state of Failed.  To help the BI Administrator correct their BI Platform Monitoring application and Overall Health, I have created a diagram showing each level in the Overall Health hierarchy, which you can use to track down the broken watches and correct the formulas.


In this example, you can see that the Overall Health state is Failed. 


[Screenshot: __her.png]


 

If any of the dependent Health Watches below the Consolidated Health Watch are failed, then the watch in the next level up will also be failed.  Therefore, you must start at the bottom of the hierarchy and correct this watch.  In this example, the server APS 2 has a failed watch, therefore the SIA Node 2 watch is failed, the Enterprise Nodes watch is failed, and so on.

 

[Screenshot: __her2.png]

 

After correcting the APS 2 Health State watch formula, all of the parent watches are now also showing a correct value and the Overall Health is Green (OK).  Note that, after you correct the child watch formula, you should wait a few minutes, as there is a metric refresh interval of 60 seconds (by default) after which the Monitoring Service will update the status of all watches in the system.  In other words, the change in Overall Health will not happen immediately after correcting the dependent watches, so be patient.

 

[Screenshot: __her33.png]

 

Repairing the Server Watch formulas

 

When creating a new server or using the System Configuration Wizard, you will find that the automatic routine that handles this is not perfect: depending on which service you are creating, the automatically generated system watch may contain the wrong server name reference or, in some cases (such as the Connection Server), the wrong metric altogether.  When you edit the watch's danger rule or caution rule, you will see, in red, the erroneous contents of the formula that need to be corrected.

 

A server Health State watch should contain, at the very least, a check to make sure the server is running.  Depending on the granularity you desire, you can create a two-state watch or a three-state watch.

 

[Screenshot: ____.png]

 

 

 

If you want to see a yellow caution state when a server is stopping or starting, you should use a three-state watch.  If you are only interested in seeing a green state for running and red for any other state, you can use a two-state watch.  Using the server metric Server Running State, you can easily create a new server watch based on whether that server is available or not.

 

 

Server Running State Values

 

  • Stopped = 0
  • Starting = 1
  • Initializing = 2
  • Running = 3
  • Stopping = 4
  • Failed = 5
  • Running With Errors = 6
  • Running With Warnings = 7

 

 

See below an example of both two-state and three-state watches that check for server availability.  In this example, my SIA node name is NODE and the server name is SERVERNAME.

 

 

Two-state watch formula:

 

Danger Rule: NODE.SERVERNAME$'Server Running State'!=3

 

Three-state watch formula:

 

Caution Rule: NODE.SERVERNAME$'Server Running State'==1 || NODE.SERVERNAME$'Server Running State'==2 || NODE.SERVERNAME$'Server Running State'==4 || NODE.SERVERNAME$'Server Running State'==6 || NODE.SERVERNAME$'Server Running State'==7
Danger Rule: NODE.SERVERNAME$'Server Running State'==0 || NODE.SERVERNAME$'Server Running State'==5

 

 

Factoring performance into the server health state

 

In some cases, such as the Central Management Server, the load on the CMS is used to determine the server health state.  Depending on which type of server you are editing the watch for, there is a variety of different metrics that can be used to determine load.  You may also want to include in your server watch formula some thresholds for these metrics, so that the server health state depends not only on availability but also on how well the service is performing and whether it is able to take on more jobs.
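
For example, a caution rule could flag the CMS as soon as it is either not running or carrying too much load.  This is only a sketch: 'LOAD_METRIC' and THRESHOLD are placeholders for a real metric name and threshold value taken from the Administrator Guide, not actual metric names.

Caution Rule: NODE.CentralManagementServer$'Server Running State'!=3 || NODE.CentralManagementServer$'LOAD_METRIC'>THRESHOLD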


Refer to the BI Platform Administrator Guide for more information on server metrics to determine which metrics are suitable for your BI landscape.

How to Create a Program File in BI4


I've written a number of jsp, vbscript, and java based applications in the past, and lately I've really enjoyed writing simple java apps that can be run as a Program File (.jar) within the BI 4.x environment.  A couple of examples of these Program Files were shared earlier this year, called 'biUserSessionKillScript' and 'DisableInactiveUsers'.

 

How to Delete Stale BI4 User Sessions with biUserSessionKillScript

How to Auto-Disable Inactive Users in BI4

 

In BI 4.x, Program Files can be in the form of a batch file, vbscript, javascript, shell script, or a jar file.  The examples above were compiled as .jar files and below I'll explain what it takes to create your own java Program File.  I've created a template that can be imported into Eclipse to help get you started.  The source code is attached to a Kbase which can be downloaded separately here:

 

SAP Note 2099941

 

In this example, it's assumed you know how to import a java project into Eclipse and how to export it to a jar file.

 

The What and Why:


Developing a BI4 Program File has many benefits.

  • For one, your application will be executed by the AdaptiveJobServer, which means it will look to the BI4 java classpath used by the AJS.  There's no need to package all the extra dependent jars as you would for a standalone application.  Your import statements will automatically look for and find these libraries in the "C:\Program Files (x86)\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\java\lib" directory, assuming you are only using the BI SDK based libraries.
  • Second, a Program File offers all the benefits of historical instances.  But instead of having a Webi or CR report as your scheduled instance, a .txt file is generated in the Output FRS location with all of the System.out.println() information from within your program.  This can be beneficial both for debugging your application and for giving a historical look at whatever your application did when it was scheduled.
  • Lastly, since the program is stored, executed, and managed within the BI system, the security for running the application can be set just as for any other infoobject.

 

Attached at the bottom of this article is an Eclipse project template for creating a new BI4 program file.  Feel free to use this as a starting point for creating your own Program File.  Below is an explanation and breakdown of how this application template works.

 

How it works:

 

In the top portion of the example you'll find the class definition and main method.  The difficulty in writing a new program file is that the application needs to be able to run standalone in order for you to develop it, alter it, and debug it within Eclipse.  When the application is run as a program file within BI, the standalone portion of this application will never be executed.  In other words, the main() method is never actually called.  Instead, the Adaptive Job Server calls the run() method directly.

 

The example starts off with the class definition and the main() method.  The class implements the IProgramBase interface.  This is a requirement for all BI4 program file applications because of how the program job server service calls the run() method directly.

[Screenshot: image1.jpg]

The majority of the main() method code does nothing more than create an enterprise session and pass it to the run() method.  Be careful of what you include in your main() method, as this will only be executed when your application is run outside of BI (within Eclipse, via the command line, or any other means).  All code you want executed when the application runs within BI must be inside the run() method or in another method called from this point down.  Again, think of the run() method as your starting point.

 

In order to run the application from the command line in standalone mode, you will need to create an enterprise session.  This requires 4 command line parameters passed to your application.  The code below first ensures that 4 values were entered.  It compensates for a 5th value in case something else needs to be passed, but this isn't necessary for the code below to work.

 

[Screenshot: image2.jpg]

There's not much going on here in the run() method.  In this example, it's assigning the username within the enterprise session to a variable and then writing it to the console.  However, every time you output text to the console via System.out.println(), the Adaptive Job Server will write this to the text file created in the Output FRS when the program file is scheduled.  There's no need for additional code to output your information to a text file.  This is all built into BI and done automatically at schedule time.

 

[Screenshot: image3.jpg]
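
Putting the pieces from the screenshots together, a minimal sketch of such a template might look like the code below.  Treat it as an approximation, assuming the standard BI4 Java SDK entry points behave as described above; the template attached to the SAP Note is the authoritative version.

import com.crystaldecisions.sdk.exception.SDKException;
import com.crystaldecisions.sdk.framework.CrystalEnterprise;
import com.crystaldecisions.sdk.framework.IEnterpriseSession;
import com.crystaldecisions.sdk.occa.infostore.IInfoStore;
import com.crystaldecisions.sdk.plugin.desktop.program.IProgramBase;

public class ProgramObjectTemplate implements IProgramBase {

    // Standalone entry point: only used when run from Eclipse or the command line.
    // Expects: <username> <password> <cms name> <auth type>
    public static void main(String[] args) throws Exception {
        if (args.length < 4) {
            System.out.println("Usage: ProgramObjectTemplate <username> <password> <cms> <authType>");
            return;
        }
        // Build the same objects the Adaptive Job Server would pass to run().
        IEnterpriseSession session =
            CrystalEnterprise.getSessionMgr().logon(args[0], args[1], args[2], args[3]);
        try {
            IInfoStore infoStore = (IInfoStore) session.getService("InfoStore");
            new ProgramObjectTemplate().run(session, infoStore, args);
        } finally {
            session.logoff();
        }
    }

    // Entry point called by the Adaptive Job Server when scheduled within BI.
    // Everything printed via System.out.println() ends up in the instance's .txt output file.
    public void run(IEnterpriseSession session, IInfoStore infoStore, String[] args)
            throws SDKException {
        System.out.println("Running as: " + session.getUserInfo().getUserName());
    }
}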

 

Although this really doesn't do much, keep in mind it's just an example.  The program will run as-is and output to the console the user name held in the enterprise session.  To test this within the Eclipse IDE, you will need to create a Run Configuration.

 

  1. To do this, right click on the project name within the Package Explorer, choose Run-As > Run Configurations.
  2. Highlight Java Application on the left and click the 'new launch configuration' button.
  3. Under the Main tab:
    • The project name should be auto-filled in if you right clicked on the correct package within the Package Explorer.  If it's not, then browse for it and select the project.
    • You'll need to search for and select the main class in the 2nd box.  Click the Search button to do this and select the ProgramObjectTemplate classname.
  4. Under the Arguments tab:
    • Add the 4 command line arguments separated by a space.
    • The values specified in the main() method are:  <username> <password> <cms name> <auth type>
  5. Once the Run Configuration is created and saved, you can run the application within the Eclipse IDE and test it.  Once tested, you'll be ready to export it to a jar and import the jar into the Central Management Console.

 

[Screenshot: image5.jpg]

[Screenshot: image7.jpg]

Import the Program File into the CMC


To import the Program File into the CMC, you'll need to follow these steps:


  1. Log into the CMC
  2. Navigate to a folder where you want to import the file. Right click on the folder, choose Add > Program File
  3. Choose Browse and select the .jar file you exported from Eclipse
  4. Select the Java radio button and click OK
  5. You will now see the new Program File within the folder.
  6. Before you can schedule it, you'll need to set the default Program Parameters.  To do this, right click on the Program File and select Default Settings > Program Parameters
  7. In the Class to Run box, type the name of the class - ProgramObjectTemplate.
  8. If there were any additional command line parameters after the first 4 we created (user, password, cmsname, and authtype), you would set these in the Arguments box.  The BI4 Java SDK will automatically create a user session at schedule time based on the user that schedules the program file, so passing these here isn't required.  The only arguments you would need in the Arguments box are the additional command line parameters beyond these 4.  See either of the examples mentioned at the beginning of this blog, as both use an additional command line argument.


[Screenshot: image8.jpg]

[Screenshot: image9.jpg]

How to Auto-Disable Inactive Users in BI4


Over the past few years, I've been asked to create or assist in creating various scripts to automate a task in SAP Business Intelligence 4.x (BI4).  Most of these scripts were loose files in the form of .jsp or .vbs that had to be run manually.  Lately, I've taken a personal liking for Program Files which are uploaded and stored within the BI environment.  The last request was for a script that would disable any user that hadn't logged into the CMS within the past 90 days.  Rather than hard-coding the 90 days into the program file, I created it to take the number of days as an argument (or input parameter).  And if the argument is set to 0, then the script will only display the user info without committing any changes.

 

If you're interested in creating your own Program File and want more information on how to do this yourself, I wrote another blog titled How to Create a Program File in BI4.  The blog describes the steps and requirements for creating a java program in Eclipse as well as provides a project template to start with.

 

Description

 

The following script can be used to disable users in bulk when they have not logged into the BI system for the past X number of days.  This is equivalent to logging into the CMC manually, navigating to a user's property page, and clicking the 'Account is Disabled' checkbox under the 'Attribute Binding' section.

 

Where to Download


The source code and .lcmbiar file can be downloaded from the following SAP Note#:  http://service.sap.com/sap/support/notes/2097401

 


How to import the script into BI 4.x

 

The script can be imported into your BI environment using Promotion Management.  The zip downloaded from the kbase has the .lcmbiar file within it.  Follow the steps below to import it using Promotion Management.

 

  1. Log into the Central Management Console (CMC) as Administrator
  2. From the home page, click on Promotion Management
  3. Click on the Import dropdown menu and choose the option, Import File
  4. The Import from File dialog box appears.  Ensure the 'File System' radio button is selected and click on 'Browse'
  5. Unzip the file you previously downloaded.  In this box, navigate to the extracted .lcmbiar file and click OK.
  6. There will be a 'new job' tab opened with some of the information from the .lcmbiar file automatically filled out (such as Name, Description, etc).  Select the Destination menu and choose the CMS you'd like to import the Program File into.  You will be prompted to provide permissions for this CMS.
  7. Click Create.
  8. Now that the import job is created, you will need to run the promotion.  Click on the Promote button in the toolbar.
  9. The summary page opens showing exactly which objects will be promoted.  You should see 2 items.  A folder object and a Program object.  Click the Promote button at the bottom of this page.
  10. When the job is finished running, the Status column will show 'Success' as a result.  The job should take less than a minute to run, so if you don't see it succeed within a few minutes, make sure you have an Adaptive Job Server running that contains a Promotion Management Scheduling Service.

 

Running the Program File

 

Once the .lcmbiar file is imported (see the steps above), you will see a folder called "Admin Scripts" under the top level root folder.  Be sure to set permissions on this folder accordingly so it's not accessible to non-Admin users.  To schedule, follow the steps below:

 

  1. Navigate to the 'Admin Scripts' folder, right click on the Program File underneath called 'DisableInactiveUsers', and choose Properties.
  2. Under the 'Schedule' option on the left side, choose the Program Parameters option.
  3. Set the Arguments field to the number of days since last logon.  This value must be a positive integer.  (In other words, 10.5 days will not work.)
  4. Click Schedule.

 

 

RECOMMENDATION:  The first time you run the program file, run it with an argument of 0.  This will give a full output of all users and how recently each user has logged in.  The output can be read as-is in text or easily copied into Excel and sorted as needed.  Once you know the number of days you want to run the job with, set the 'Arguments' value accordingly.

 

Other Info

 

As with any script, there can be unexpected results.  Test the script first in a lower level environment such as a Test or Development system.  Be sure to have a backup of your CMS system database in case you need to quickly revert the changes.  And lastly, if you'd like to see the source code, the Eclipse project files are included in the zip file attached to the same SAP Note.

 

The Arguments field is set to 0 by default.  This value represents the minimum number of days since the last successful login.  The script will query the SI_LASTLOGONTIME value stored in the CMS repository for each user and determine how many days ago this SI_LASTLOGONTIME was (based on the current time where the Adaptive Job Server is running).  Any user that has not logged in within this number of days will be disabled when the script is run, with one exception: any user that has never logged in at least once will not have an SI_LASTLOGONTIME property and will thus be left unaltered.  For these users, the output file will show the equivalent of the Java Date(0) value, which is either 1969 or 1970 depending on your locale settings.

 

If you leave the days value as 0, the script will not commit any changes.  In this case, it will only output the full user list with each user's last login date/time and whether that user is currently disabled or not.  If any value greater than 0 is used, the script will attempt to disable any users that match the criteria, except those that have never logged into the CMS at least once.
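
As a rough sketch of this logic (the shipped source is attached to the SAP Note above - the alias iteration and the setDisabled() call below are my assumptions about how the disable step can be implemented, not the actual code):

import java.util.Date;

import com.crystaldecisions.sdk.exception.SDKException;
import com.crystaldecisions.sdk.occa.infostore.IInfoObjects;
import com.crystaldecisions.sdk.occa.infostore.IInfoStore;
import com.crystaldecisions.sdk.plugin.desktop.user.IUser;
import com.crystaldecisions.sdk.plugin.desktop.user.IUserAlias;
import com.crystaldecisions.sdk.properties.IProperty;

public class DisableInactiveUsersSketch {

    // days == 0 -> report only; days > 0 -> disable users inactive for at least that many days.
    // Note: CMS queries return a limited number of objects per call (1000 by default),
    // so a large user list would need paging.
    static void disableInactiveUsers(IInfoStore infoStore, int days) throws SDKException {
        IInfoObjects users = infoStore.query(
            "SELECT SI_ID, SI_NAME, SI_LASTLOGONTIME, SI_ALIASES "
          + "FROM CI_SYSTEMOBJECTS WHERE SI_KIND='User'");

        for (Object o : users) {
            IUser user = (IUser) o;
            IProperty last = user.properties().getProperty("SI_LASTLOGONTIME");
            if (last == null) {
                // Never logged in: no SI_LASTLOGONTIME property, leave the user unaltered.
                System.out.println(user.getTitle() + ",never logged in,left unaltered");
                continue;
            }
            Date lastLogon = (Date) last.getValue();
            long daysAgo = (System.currentTimeMillis() - lastLogon.getTime()) / 86400000L;
            System.out.println(user.getTitle() + "," + lastLogon + "," + daysAgo + " days ago");

            if (days > 0 && daysAgo >= days) {
                for (Object a : user.getAliases()) {      // assumption: aliases are iterable
                    ((IUserAlias) a).setDisabled(true);   // the 'Account is Disabled' checkbox
                }
            }
        }
        if (days > 0) {
            infoStore.commit(users);                      // persist the changes to the CMS
        }
    }
}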

 

Once the program file is scheduled, clicking on the successful instance within the history window will show the output results similar to viewing a report instance.

 

[Screenshot: historywindow.jpg]

 

Except for the header info, the data in the output file is in a comma separated value format.  Below is an example of the output file:

 

 

[Screenshot: screenshot1.jpg]

 

If you have a larger user list, opening the csv within Excel can make the data easier to work with.

[Screenshot: screenshot2.jpg]

Installation BO 4.1 SP5


Hello,

 

 

If you try to install BO 4.1 SP5 as an update on an existing installation of BO 4.1 SP4 on Windows Server 2012 R2, you get an error after the restart of the system.

 

The BO system cannot connect to the CMS.

 

In the CCM, in the properties of the Server Intelligence Agent (SIA), on the Configuration tab, you get the error: "Failed to retrieve the cluster name from the database. Reason: Parser failed to initialize."  If you try to reconfigure the CMS, the error "FWB 00090" occurs.

 

 

To fix the problem, check whether the files sqlrule.llr and sqlrule.dfa are in <BI_install_folder>\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\win64_x64.

 

If not, copy the newest version of the files from a sub-directory of <BI_Install_folder>\Install Data\Install Cache to <BI_install_folder>\SAP BusinessObjects\SAP BusinessObjects Enterprise XI 4.0\win64_x64.
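
If you are unsure which sub-directory contains the newest copies, you can list all of them from a command prompt first (replace the placeholder with your actual installation folder):

cd /d "<BI_Install_folder>\Install Data\Install Cache"
dir /s /b sqlrule.llr sqlrule.dfa

Then copy the most recent pair into the win64_x64 folder named above.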

 

Restart the SIA and the problem is solved.

 

Regards

Andreas

Manual Entry in prompts for BW variables in Crystal Reports Enterprise and Viewer

$
0
0

The Crystal Reports Enterprise 4.1 SP05 release supports the feature of manual entry in prompts for BW variables. Here I will provide information on supported and unsupported variables, and the steps to get a multi-value field for the selection option variable.

Manual entry in prompts is supported for the following variables:

  1. Single value variable
  2. Multi-value variable
  3. Interval (Range) variable
  4. Selection option variable
  5. Formula variable
  6. Single Keydate variable

The manual entry feature is not supported for Hierarchy variables and Hierarchy node variables.

 

  1. The feature is available by default in CRE and the BOE Viewer, for old (4.1 SP04 or other) and newly created reports.

When you open any report or create a new report, you should see a text box for manual entry with an add symbol, like below.  When you refresh the report with prompts, the report will display the dialog below with the manual entry text box option.

Multiple values can be entered, separated by semicolons.

You can either enter the values manually or select them from the list.
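
For example, entering 1000;1002;1005 in the text box adds three separate values to the prompt (the values here are purely illustrative).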

[Screenshot: ManulEntry TExt field.png]

 

 

  2. To get a multi-value selection field for the selection option variable, we have to make an entry in the configuration file of the Crystal Reports Enterprise installation folder.

Make the below entry in the configuration file (C:\Program Files (x86)\SAP BusinessObjects\Crystal Reports for Enterprise XI 4.0\configuration\config.ini):

Entry to be made: sap.sl.bics.variableComplexSelectionMapping=multivalue

 

[Screenshot: MultiSelction.png]

 

 

  3. In order to get the multi-value selection field in the Viewer, we have to make an entry in the CMC as well.

Entry location: Log in to the CMC -> Servers -> Crystal Reports Services -> CrystalReportsProcessingServer -> Java Child VM arguments

Entry to be made: -Dsap.sl.bics.variableComplexSelectionMapping=multivalue

 

 

If the entry is not made in CRE or the Viewer, the field appears as an Interval (Range) field.

[Screenshot: Range.png]

 

  Hope it helps…

How to speed up connection between Translation Manager, Universe Designer and CMS

$
0
0

At a recent customer project, my teammate and I were facing an issue with importing Webi documents and universes into Translation Manager (all on BO4.1 SP4), getting the following error message:

 

org.apache.axis2.AxisFault: Unable to find servers in CMS, servers could be down or disabled by the administrator (FWN 01014)

 

There is a KB article describing this error and a solution: http://service.sap.com/sap/support/notes/1879675

 

On the one hand, we followed the instructions in the KB article and specified the hostname for each server. Although our server is multihomed (that is, it has more than one network interface), we hadn't thought about this beforehand, because the second network interface goes into a backup LAN and is never used for communication with e.g. the client tools. In addition, everything had worked fine so far - until we got the issue with Translation Manager. Still, just setting the hostname was not enough. On the server side the Windows firewall is enabled, so we had to assign static request ports to several services. Before our issue with Translation Manager, we had already done so for the CMS, the Input & Output FRS, and all Webi Processing Servers. We used TCPView to analyze on which ports Translation Manager was opening connections. As long as there were requests on ports which we hadn't specified in the CMC (some random port number, e.g. 56487), we kept narrowing down the services to which Translation Manager establishes a connection. We had to specify a port for all of the following servers (see the note after the list on how to assign a static port):

  • The Adaptive Processing Server (APS) hosting the DSL-Bridge Service
  • The APS hosting the Client Auditing Proxy Service
  • The APS hosting the Search Index Service
  • The APS hosting the Translation Service
  • The WACS (Web Application Container Server)
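
(For reference: a static request port is assigned in the CMC under the server's properties - in the Common Settings section, clear the 'Auto assign' checkbox next to Request Port, enter a free port number, save, and restart the server.)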

 

Besides the issue with Translation Manager, we had another issue with creating new universe connections in the Universe Design Tool. When creating a new connection, we had to wait up to 30 minutes ( ! ) to get to the wizard page where you can select from the list of available database drivers. Still, after 30 minutes everything worked fine and we could create the connection successfully. Based on our experience with Translation Manager, we ran TCPView again and found that we needed to assign a port number to both Connection Servers (32 and 64 bit) in the CMC. Having done this, creating a new connection now works without any waiting time.

 

After all this: if you have firewalls between your BO server and the client tools, just assign a port to all available servers and open those ports on the firewall (the only exception might be the Event Server, as I'm really not aware of any communication between this server and a component outside the BO server).

New Maintenance Dates for SAP BusinessObjects BI4.1






 


Friday, December 5 2014



To enable our customer base to adopt SAP BusinessObjects BI4.1, SAP is pleased to announce the extension of maintenance for SAP BusinessObjects BI4.1 with two years of additional support. The End of Maintenance and Priority One dates have been extended as of today!

 

 

In this issue:

 

1. New SAP BusinessObjects BI4.1 End of Maintenance Dates

 

By SAP support standards, the End of Maintenance dates are defined by a 7+2 year support plan for a product line; this resulted in an End of Maintenance for SAP BusinessObjects BI4.1 by December 31, 2016. However, while listening to our customers, SAP learned that the existing End of Maintenance dates were too short to enable a full adoption of SAP BusinessObjects BI4.1.

To enable all our customers in the adoption of SAP BusinessObjects BI4.1, SAP has decided to extend the default Mainstream Maintenance by an additional two years.

 

  • End of Mainstream Maintenance for SAP BusinessObjects BI4.1 is now December 31, 2018
  • End of Priority One Support for SAP BusinessObjects BI4.1 is now December 31, 2020


 


 

SAP BusinessObjects BI4.1 Product Availability Matrix ›


 

SAP Maintenance Strategy Rules


 

SAP Mainstream Maintenance Rules ›


 

SAP BusinessObjects Priority-One Rules ›

 

 

2. Existing SAP BusinessObjects Products Support Overview

 


Although the Mainstream Maintenance dates for SAP BusinessObjects BI4.1 have been extended by two years, this is not the case for other SAP BusinessObjects products. To provide a complete overview of SAP BusinessObjects products and their End of Mainstream Maintenance dates, the list below is provided.

In case you are currently not running an SAP BusinessObjects BI4.1 release, please validate your current Mainstream Maintenance dates. If those dates have passed or will pass in the near future, it is strongly recommended to upgrade your existing environment to SAP BusinessObjects BI4.1. For details on the upgrade process, please read:

 


SAP BusinessObjects Platform BI4.1


End of Mainstream Maintenance -> December 31, 2018

End of Priority One Support -> December 31, 2020

 


SAP BusinessObjects Platform BI4.0


End of Mainstream Maintenance -> December 31, 2015

End of Priority One Support -> December 31, 2017

 


SAP BusinessObjects Enterprise XI3.1


End of Mainstream Maintenance -> December 31, 2015

End of Priority One Support -> December 31, 2017

 


Prior to SAP BusinessObjects Enterprise XI3.1


 

End of Mainstream Maintenance -> Expired

End of Priority One Support -> Expired

 

 

3. Where to find more information about (upgrading to) SAP BusinessObjects BI4.1

 

SAP has been building up a dedicated webpage with a collection of the most valuable information regarding the SAP BusinessObjects BI4.1 Suite. Wherever you are in your Business Intelligence journey, you will find resources there that will help your organization be successful with SAP BusinessObjects BI solutions.

 

 

The SAPBUSINESSOBJECTSBI.COM website will enable you to:

 

  1. Be Your Organization's BI Champion
    Leverage the resources below to show what's possible with SAP BusinessObjects BI solutions and increase BI excitement within your organization.

  2. Empower Employees with a BI Strategy
    A solid BI strategy helps you get the most from your data assets, technology investments and BI initiatives.

  3. Design and Manage a Successful Implementation
    Address organizational and governance needs as well as the technology portion of your implementation.

  4. Get Personalized Upgrade Advice
    Start planning your upgrade with a personalized guide from an upgrade expert. The report will help you get the most out of your BI deployment.

  5. Advance your skills with BI Academy
    Learn more about BI and SAP BusinessObjects BI solutions.

  6. Events & Webinars
    Find new perspectives, alternative approaches, and spark your BI creativity.

 

 


 

www.sapbusinessobjectsbi.com


Behind the Scenes with Platform Search in Business Intelligence Platform 4.x


  Most people have noticed that the Platform Search application works differently in Business Intelligence Platform (BI) 4.x compared to the previous release.  The architecture of Platform Search has changed significantly since BI 4.0.  It provides scalable and flexible BusinessObjects content indexing and search infrastructure support for the different proprietary BOE content types.  It can be set to real-time indexing, so that the user is not required to restart the indexing every time they want the latest content indexed: when documents are published, modified, or deleted in the repository, the application identifies those documents and indexes them.  Alternatively, it can be set to schedule-based indexing, which triggers the indexing at the scheduled time.  Either way, the user can search in BI Launchpad while indexing is happening.  Platform Search also supports load balancing and failover for both indexing and searching in a clustered environment.

 

  The Platform Search service is the service in the Adaptive Processing Server which has the logic to index the BOE content and search it.  It uses Apache Lucene, a free open-source information retrieval software library from the Apache Software Foundation.  The version of Apache Lucene currently used by BI 4.0 and BI 4.1 is 2.4.1.

 

  The functionality of the Platform Search service can be divided into Indexing and Searching.  Before content becomes searchable, it needs to be indexed.  In a large system with a large number of infoobjects, getting all the infoobjects fully indexed the first time can be time consuming, because indexing involves several sequential tasks.  I will talk about the indexing process in this blog.

 

 

Indexing Process

 

Indexing is a continuous process that involves the following sequential tasks:

 

1.     Use the crawling mechanism to poll the CMS repository and identify objects that are published, modified, or deleted.  This can be done in two ways: continuous and scheduled crawling.

 

2.     Use the extracting mechanism to call the extractors based upon the document type.  There is a dedicated extractor for every document type that is available in the repository.  These are the extractors:

 

    • Metadata Extractor
    • Crystal Reports Extractor
    • Web Intelligence Extractor
    • Universe Extractor
    • BI Workspace
    • Agnostic Extractor (Microsoft Word/Excel/PPT, Text, RTF, PDF)

     

    3.      Use the indexing mechanism to index all the extracted content through the third-party library, the Apache Lucene engine.  The time required for indexing varies, depending on the number of objects in the system and the size and type of documents.  It involves the following steps:

      1. The extracted content will be stored in the local file system (<BI 4 Install folder>\Data\PlatformSearchData\workplace\Temporary Surrogate Files) in an XML format, as so-called surrogate files.
      2. These surrogate files will be uploaded to the Input File Repository Server (FRS) and removed from the local file system.
      3. The content of the surrogate files will be read and indexed by the specific index engine into a temporary location called the Delta Indexing Area (<BI 4 Install folder>\Data\PlatformSearchData\workplace\DeltaIndexes).
      4. The delta index will be uploaded to the Input FRS and deleted from the local file system.
      5. The delta index will be read and merged into the Master Indexed Area (<BI 4 Install folder>\Data\PlatformSearchData\Lucene Index Engine\index), which is the final indexed area in the local file system.

     

             For indexing to run successfully, the following servers must be running and enabled:

     

          • InputFileRepositoryServer
          • OutputFileRepositoryServer
          • CentralManagementServer
          • AdaptiveProcessingServer with the Platform Search Service on
          • AdaptiveJobServer (for scheduled crawling)
          • WebIntelligenceProcessingServer (when the Web Intelligence content type is selected)
          • CrystalReportApplicationServer (when the Crystal Reports content type is selected)

     

     

    4.      Generating Content Store and Speller/Suggestions

             After completing the indexing task, the following things will be generated:

     

          • Content Store: The content store contains information such as id, cuid, name, kind, and instance, extracted from the master index in a format that can be read easily.  This helps speed up the search.

    Each AdaptiveProcessingServer creates its own content store (<BI 4 Install folder>\Data\PlatformSearchData\workplace\<NodeName>.AdaptiveProcessingServer\ContentStores).

     

     

          • Speller/Suggestions: Similar words will be created from the master indexed data and indexed.  The speller folder will be created under the "Lucene Index Engine" folder (<BI 4 Install folder>\Data\PlatformSearchData\Lucene Index Engine\speller).

     

     

     

     

     

    Platform Search Queues

     

      Internally, the above sequential indexing tasks are handled by Platform Search Queues.  When indexing is started, an infoobject will eventually go through the following queues, in this order:

    To Be Extracted > Under Extraction > To Be Indexed > Indexing > Delta Index To Be Merged > Content Store Merge


    If multiple Platform Search Services exist, there is only one To Be Extracted, To Be Indexed, Delta Index To Be Merged, and Content Store Merge queue for all nodes.  But each Platform Search Service has its own Under Extraction queue and Indexing queue.  Only one Platform Search Service will be designated as the master service to merge the delta indexes into the master index.

     

      Each Platform Search Queue is itself an infoobject; the status of each Platform Search Queue can be retrieved by running the following query in the Query Builder:

    SELECT * FROM CI_INFOOBJECTS,CI_APPOBJECTS,CI_SYSTEMOBJECTS WHERE SI_KIND = 'PlatformSearchQueue'

     

    It will return the results with the following SI_NAMEs:

    • Platform Search (Delta Index To Be Merged) Queue
    • Platform Search (To Be Indexed) Queue
    • Platform Search (To Be Extracted) Queue
    • Platform Search (Exclude Documents) Queue
    • Platform Search (Include Documents) Queue
    • Platform Search Content Store Merge Queue
    • Platform Search (Under Extraction - Enity - AcpzqPRw1thIk_GYPiEETF8)
    • Platform Search (Indexing - Enity - AcpzqPRw1thIk_GYPiEETF8)

     

    You will find a property called SI_PLATFORM_SEARCH_OBJECTS in each queue.  That property displays the number of objects being processed in that queue.  If SI_TOTAL of that property displays 0, that queue is empty.
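
    For example, to check the backlog of a single queue (using one of the SI_NAMEs listed above), you can narrow the query down:

    SELECT SI_NAME, SI_PLATFORM_SEARCH_OBJECTS FROM CI_INFOOBJECTS,CI_APPOBJECTS,CI_SYSTEMOBJECTS WHERE SI_KIND = 'PlatformSearchQueue' AND SI_NAME = 'Platform Search (To Be Indexed) Queue'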

     

      Exclude Documents and Include Documents are two special queues that handle excluded documents.  When you add documents in CMC > Applications > Platform Search Application > Properties > Documents Excluded from Indexing, the documents will be added to the Platform Search (Exclude Documents) Queue.  When infoobjects are extracted, these documents will be excluded.


      When you remove documents from CMC > Applications > Platform Search Application > Properties > Documents Excluded from Indexing, the documents will be removed from the exclude documents queue and added to the Platform Search (Include Documents) Queue.  Crawling only adds documents to the To Be Extracted queue if the infoobject or its content has been modified, or if it is a new infoobject.  Infoobjects removed from the excluded documents are neither new nor modified, so they won't be picked up by crawling; they are added to this special queue so that they will be added to the To Be Extracted queue.


      From the Platform Search Queues result, you can see that the Under Extraction and Indexing queues are associated with a Platform Search Service session SI_CUID, because each Platform Search Service has its own Under Extraction queue and Indexing queue.  The information about Platform Search Service sessions can be retrieved by running the following query in the Query Builder:

    SELECT * FROM CI_INFOOBJECTS,CI_APPOBJECTS,CI_SYSTEMOBJECTS WHERE SI_KIND = 'PlatformSearchServiceSession'

     

    Each Platform Search Service should have one session.  If the heartbeat (SI_PLATFORM_SEARCH_HEARTBEAT_TIMESTAMP) isn't updated regularly on one session, another search service will try to return the hung service's objects to the previous queue and take over the unfinished work.


    Here are some other useful queries you can run to get information about the Platform Search application.

     

     

    Retrieving the general information about Platform Search Application

    SELECT * FROM CI_INFOOBJECTS,CI_APPOBJECTS,CI_SYSTEMOBJECTS WHERE SI_KIND = 'PlatformSearchApplication'

     

    The property SI_PLATFORM_SEARCH_SERVICE_CONTEXT_ACTION shows whether indexing is running: 0 means indexing is not running, 1 means indexing is running.

     

     


    Retrieving the information of Platform Search Application Status

    SELECT * FROM CI_INFOOBJECTS,CI_APPOBJECTS,CI_SYSTEMOBJECTS WHERE SI_KIND = 'PlatformSearchApplicationStatus'

     

    For example, you can check the following properties:

    • SI_PLATFORM_SEARCH_LAST_TO_BE_EXTRACTED_DAILY_MAX_OBJECT_ID
    • SI_PLATFORM_SEARCH_LAST_TO_BE_EXTRACTED_ID
    • SI_PLATFORM_SEARCH_LAST_TO_BE_EXTRACTED_MAX_ID
    • SI_PLATFORM_SEARCH_LAST_TO_BE_EXTRACTED_MAX_FOLDER_ID
    • SI_PLATFORM_SEARCH_LAST_TO_BE_EXTRACTED_UNIVERSE_ID
    • SI_PLATFORM_SEARCH_LAST_TO_BE_EXTRACTED_TIMESTAMP

     

    SI_PLATFORM_SEARCH_LAST_TO_BE_EXTRACTED_MAX_ID represents the SI_ID of the last infoobject which was added to the To Be Extracted queue.  Infoobjects are added to the To Be Extracted queue in batches.  So if we have a batch of 100 infoobjects which are added to the To Be Extracted queue, this field will have the max SI_ID among the SI_IDs of those infoobjects.

     

     

    SI_PLATFORM_SEARCH_LAST_TO_BE_EXTRACTED_ID represents the SI_ID of the last infoobject which was added to the To Be Indexed queue.  When indexing starts, this field will have the same value as SI_PLATFORM_SEARCH_LAST_TO_BE_EXTRACTED_MAX_ID.  But if, during indexing, some infoobjects didn't get added to the To Be Indexed queue, then this field is updated with the max SI_ID of the infoobjects which actually got added to the To Be Indexed queue, while SI_PLATFORM_SEARCH_LAST_TO_BE_EXTRACTED_MAX_ID retains the original value.  For both of these fields, the SI_IDs of folders are not included.

     

     

      For the definitions of the above Platform Search-related properties, please use the latest release of the SAP BI Platform Support Tool.  A new report option has been added in the BI Platform Support Tool that provides detailed information on Platform Search and how it is performing.

     

     

    [Screenshot: BISupportTool.png]

     

     

     

      I hope this blog helps you to understand how Platform Search Indexing works.

    How to Loadbalance 2 Tomcat instances using Apache Web Server


    Hello All,

     

    In this blog, we will go through the steps necessary to configure the Apache Web Server as a load balancer for the Tomcat application server.  This document is helpful for BI 4.0/4.1, which ship with Tomcat 6 or 7.

     

    Why Load Balance?

     

    To reduce the load on the web application server alone, you can set up a dedicated web server.
    You can use the Apache web server to load balance Tomcat, WebLogic, WebSphere, or other web application servers.

     

    Please go through the Web Application Deployment guide for more details.

     

    Why use the Apache web server as a load balancer?

     

    A large number of requests and users requires multiple application servers running, so that all requests can be handled in a timely manner and there is a safety option if one of the application servers crashes.  One of the common approaches in situations like that is to use the Apache web server as a load balancer for the web traffic targeted at multiple Tomcat application servers.  Because the Apache web server is quick and has only one responsibility - forwarding requests to the Tomcat servers, which do the actual work such as loading data from the database and returning it to the user - it makes sense to use it as a load balancer to easily relieve the pressure on the hard-working Tomcat instances.

     

    NOTE: This blog is intended to provide the steps to configure Tomcat load balancing using Apache.  Please note that SAP will not assist in the setup or the post-configuration steps, as this solution involves the configuration of 3rd-party applications.

     


    Ingredients:


    1. SAP BusinessObjects Business Intelligence platform 4.0
    2. SAP BusinessObjects Business Intelligence platform 4.1
    3. Any OS
    4. Apache 2.x
    5. mod_jk as connector

     

    The following steps outline the required downloads for Apache 2.2:



    1. Download the Apache 2.2 64-bit package from http://www.apachelounge.com/download/win64/

     

    • At the time of writing, the latest version was 2.2.29

     

    2. Download the mod_jk connector binary from http://www.apachelounge.com/download/win64/

     

    • At the time of writing, the latest version was 1.2.40

     

    Extract the included mod_jk.so and place it in C:\Apache2\modules (or whichever directory you extract Apache to) after you have completed the next section.

     

     

    Step 1: Installation of Apache service on Windows server.


    1. Once you have extracted Apache 2.2, copy it to the appropriate drive, e.g. C:\Apache2.


    2. Launch a command prompt and navigate to C:\Apache2\bin.

     

    [Screenshot: 0.JPG]


    3. Type the below command:

     

    httpd -k install -n BOEXI40Apache

     

    [Screenshot: 1.JPG]

     

    • Note that the CCM is configured to display any services labeled with a prefix of BOEXI40.  By naming the service this way, we allow it to appear in the CCM for ease of management.

     

    4. Change the display name of the service to include the version of Apache:

     

    sc config BOEXI40Apache displayname= "Apache HTTPD 2.2.29"

     

    [Screenshot: 3.jpg]

     

    • Note that if you have the World Wide Web Publishing Service (IIS) on the server, it is required that you stop/disable it or ensure it runs on a port other than 80.  You may test that Apache is running properly by accessing http://localhost.  If the server is up and running, you should get a simple HTML page with the text, "It works!"

     

     

    Step 2: Configure httpd.conf

     

    1. Stop Apache & Tomcat service.


    2. Navigate to C:\Apache2\conf.


    3. Backup the existing httpd.conf.


    4. Edit httpd.conf & add the below parameters to the end of the file:

     

    # Load module
    LoadModule jk_module modules/mod_jk.so
    # Specify path to worker configuration file
    JkWorkersFile conf/workers.properties
    # Configure logging and memory
    JkShmFile logs/mod_jk.shm
    JkLogFile logs/mod_jk.log
    JkLogLevel info
    # Configure monitoring
    JkMount /jkmanager/* jkstatus
    <Location /jkmanager>
    Order deny,allow
    Deny from all
    Allow from localhost
    </Location>
    # Configure applications
    JkMount /webapp-directory/* loadbalancer
    JkMount /jsp-examples/* loadbalancer
    JkMount /servlets-examples/* loadbalancer
    JkMount /clusterjsp/* loadbalancer
    JkMount /* loadbalancer
    JkMount /*.jsp loadbalancer

     

    Here's a quick explanation of the parameters we just configured:

     

     

     


    LoadModule


    This command makes the mod_jk module available for use. The extension of the module itself will vary by operating system.


    JkWorkersFile


    Sets the path to the worker configuration file, which we will create in the next step.


    JkShmFile


    Sets the path to the shared memory files for the module. Generally, you'll want to keep this with the logs.


    JkLogFile


    Sets the path to the module log file.


    JkLogLevel


    Sets the level of logging for the module. The valid values for this attribute, in descending order of verbosity, are "debug", "info", and "error".


    JkMount


    This is used to map a certain URL pattern to a specific worker configured in the worker configuration file. Here, we use it first to enable /jkmanager as the access URL for jkstatus, a virtual monitoring worker, and then to map all requests we want to be handled by the cluster to the "loadbalancer" worker, a virtual worker that contains the load balancing capability.


    Location


    This is a security constraint. The settings we have included allow access to the jkmanager only from the localhost (this is a Good Idea).

     

     

    Step 3: Configure cluster Workers:

     

    1. Create a workers.properties file under C:\Apache2\conf.

     

    2. Edit the workers.properties file and copy in the lines below:

     

    #workers.java_home="C:\Program Files (x86)\Java\jre7"
    worker.list= appserv1, appserv2, loadbalancer
    #define appserv1 instance Tomcat worker
    worker.appserv1.port=8009
    worker.appserv1.host=<ServerName1>
    worker.appserv1.type=ajp13
    worker.appserv1.lbfactor=1
    worker.appserv1.cache_timeout=600
    worker.appserv1.socket_keepalive=1
    worker.appserv1.socket_timeout=300
    #failover node for appserv1
    worker.appserv1.redirect= appserv2
    #define appserv2 instance Tomcat worker
    worker.appserv2.port=8009
    worker.appserv2.host=<ServerName2>
    worker.appserv2.type=ajp13
    worker.appserv2.lbfactor=1
    worker.appserv2.cache_timeout=600
    worker.appserv2.socket_keepalive=1
    worker.appserv2.socket_timeout=300
    #failover node for appserv2
    worker.appserv2.redirect= appserv1
    #define load balancer
    worker.loadbalancer.type=lb
    worker.loadbalancer.balanced_workers= appserv1, appserv2
    worker.loadbalancer.sticky_session=true

     

    For a detailed explanation of workers.properties, you can refer to the Apache link below:


    http://tomcat.apache.org/connectors-doc/generic_howto/loadbalancers.html

     

    There is also an SCN wiki link for configuring Apache as a load balancer, which provides some more detail on workers.properties:

     

    http://wiki.scn.sap.com/wiki/display/BOBJ/Configuring+the+load+balancer

     

    You can also refer to the KBA for configuring Apache as a load balancer in 3.1:

     

    1529429 - How to create a Tomcat cluster with Apache Web server as load balancer and BusinessObjects Enterprise XI 3.1

     

    As given in the 3.1 configuration, there are certain changes that need to be made in the server.xml files of the Tomcat system.  The same for Tomcat 7 is addressed in the section below.

     

     

    Step 4: Tomcat side configuration

     

    1. Navigate to <Install Directory>\tomcat\conf

     

    2. Back up server.xml file.

     

    3. Uncomment

     

    <Connector port="8009" protocol="AJP/1.3" redirectPort="8443" URIEncoding="UTF-8" enableLookups="false"/>

     

    4. You should set jvmRoute to support load balancing via AJP.  In the server.xml file, locate the Engine element and set its jvmRoute attribute to the name of the mod_jk worker for that instance - appserv1 on the first Tomcat, appserv2 on the second - so that it matches workers.properties.  You have to make this change in the server.xml file of every Tomcat you include in the cluster.
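
    For example (only the jvmRoute attribute is new - keep the other attributes as they already are in your file), the Engine element on the first instance would look similar to:

    <Engine name="Catalina" defaultHost="localhost" jvmRoute="appserv1">

    On the second instance, use jvmRoute="appserv2".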

     

    5. Now, in the server.xml file, search for channelSendOptions="8" and make sure you remove the '/' after ="8", so that the Cluster element is no longer self-closing.

     

    6. Now copy the below parameters after channelSendOptions="8".

     

     

    <Manager className="org.apache.catalina.ha.session.DeltaManager"
             expireSessionsOnShutdown="false"
             notifyListenersOnReplication="true"/>

    <Channel className="org.apache.catalina.tribes.group.GroupChannel">
        <Membership className="org.apache.catalina.tribes.membership.McastService"
                    address="228.0.0.4" port="45564" frequency="500" dropTime="3000"/>
        <Sender className="org.apache.catalina.tribes.transport.ReplicationTransmitter">
            <Transport className="org.apache.catalina.tribes.transport.nio.PooledParallelSender"/>
        </Sender>
        <Receiver className="org.apache.catalina.tribes.transport.nio.NioReceiver"
                  address="auto" port="4000" autoBind="100" selectorTimeout="5000" maxThreads="6"/>
        <Interceptor className="org.apache.catalina.tribes.group.interceptors.TcpFailureDetector"/>
        <Interceptor className="org.apache.catalina.tribes.group.interceptors.MessageDispatch15Interceptor"/>
    </Channel>

    <Valve className="org.apache.catalina.ha.tcp.ReplicationValve" filter=""/>
    <Valve className="org.apache.catalina.ha.session.JvmRouteBinderValve"/>

    <ClusterListener className="org.apache.catalina.ha.session.JvmRouteSessionIDBinderListener"/>
    <ClusterListener className="org.apache.catalina.ha.session.ClusterSessionListener"/>
    </Cluster>

     

    Explanation of the parameters:

     

     


    Engine


    This is the standard Engine element that defines Catalina as the component responsible for processing requests. To enable session replication, you must set the "jvmRoute" attribute to match the corresponding worker you have configured in mod_jk's workers.properties file. This value must be unique for every node included in the cluster.


    Cluster


    This is the main Cluster element, within which all other clustering elements are nested. It supports a variety of attributes, but in this simple example, we have only configured one, "channelSendOptions". This attribute sets a flag within Tomcat's clustering class that chooses between different methods of cluster communication. These options are outside the scope of this article, but a safe default setting is "8", which enables asynchronous communication.


    Manager


    This is the standard element that Tomcat uses for session management. When nested inside the Cluster element, it is used to tell Tomcat which cluster-aware session manager should be used for session replication.


    Channel


    This element communicates with a component of Tomcat's clustering solution called Tribes. This component handles all communication between the clustered nodes.


    Membership


    This Tribes-related element defines the address all nodes will use to keep track of one another. The settings we have used here are the Tribes defaults.


    Sender


    This Tribes-related element, in conjunction with the Transport element nested inside it, is used to choose from and configure a number of different implementations of cluster communication. Here, we have used the NIO transport, which generally provides the best performance.


    Receiver


    This Tribes-related element configures a single Receiver component, which receives messages from other nodes' Sender components. The attributes of the element allow you to specify addresses, buffer sizes, thread limits, and more. The settings we have used here allow the nodes to automatically discover one another via an address that Tribes will generate automatically.


    Interceptor


    Interceptor elements are used to make modifications to messages sent between nodes. For example, one of the Interceptor elements we have configured here detects delays that may be preventing a member from updating its table due to timeout, and provides an alternative TCP connection.


    Valve


    Tomcat's standard Valve element can be nested within Cluster elements to provide filtering. The element includes a number of cluster-specific implementations.


    ClusterListener


    This element listens to all messages sent through by cluster workers, and intercepts those that match its implementation's specifications. These elements operate in a very similar manner to Interceptor elements, except that rather than modifying messages and passing them on to a Receiver, they are the intended recipients of the messages for which they are listening.

     

     

     

    The parameters given above are a selective set of options that has been used and tested in-house. For detailed information on tuning these parameters for your environment, refer to the following link:

     

    http://tomcat.apache.org/tomcat-6.0-doc/cluster-howto.html

     

     

    Thank You,

    Shriraj Suresh Vitkar

    BI 4 Platform Innovation and Implementation - openSAP Repeat Course


    Are you an experienced BI System Administrator? Would you like to learn how to prepare for a successful and smooth BI deployment by implementing SAP BusinessObjects BI 4? Did you miss the first openSAP BI course, BI 4.0 Platform Innovation and Implementation? Then you’ll be glad to hear we’re repeating the course starting January 21, 2015!

     

    BI 4 Platform Innovation and Implementation is aimed at experienced BI System Administrators responsible for the implementation, deployment and administration of SAP BusinessObjects BI 4. During this course, participants will have the opportunity to practice with hands-on exercises in their own Amazon cloud-based system to prepare for a successful and smooth BI deployment.

     

    In this course, we’ll cover the following BI topics:

     

    • Week 1: Introduction, Architecture & Sizing
    • Week 2: Installation, Upgrade & Promotion
    • Week 3: Troubleshooting & Authentication
    • Week 4: Performance Optimization
    • Week 5: Final Exam

     

    Almost 23,000 participants signed up for the first round of this course, which received great feedback. Here's just a small selection from the I like, I wish forum in the first course. (You must be logged in and enrolled to view the I like, I wish forum.)

     

    “Course contents are magnific. It has surprised me to discover lessons about topics that I have never seen it before in any official SAP BI courses”

     

    “Thanks a lot for the course. It helped me a lot to improve my skill set and answered many questions. I have already performed the Installation and setup for one of our customer. Now I have learnt new tips in this course. I am confident that in future set ups at client location use these tips.”

     

    “As a part-time admin for my company's internal BOBJ installation this is just what I needed to fill in the gaps since I am primarily a BI developer (not an admin). Really excited to have this available and would love to see more - especially on reporting tools. Very well done!”

     

     

    Sign up for this popular openSAP course, BI 4.0 Platform Innovation and Implementation today!

    Secure Your BI Platform Part 1


    In this new blog series, I will outline some of the best practices of securing your BI platform.

    We will identify the assets we need to protect and, based on a threat model analysis, outline the steps you can and should take to secure all aspects of the BI deployment.

     

    Are you absolutely secure?

    If you answered yes, you either blew up and burned down your entire IT infrastructure, or you are fooling yourself. Security is all about risk management. Let us therefore do a flyover of some of the ways to lock things down and manage our risk.

     

    In Part 1, we will look at securing the Identity Provider communication, and review how the data is stored.

    The main external identity providers, Active Directory, LDAP and SAP, are covered below. From a security standpoint, we are concerned about both the data moving across the network and the data about users stored in the CMS repository.

     

    Active Directory

    When using the BI Active Directory connector, the calls between the CMS and Active Directory are encrypted natively by the Microsoft infrastructure.  The good thing about this is that you do not need to take any additional steps to protect the network communication for this purpose.

     

    To access Active Directory to query for and map users, the BI system requires AD credentials.

    The data at rest, meaning the data stored in the CMS database, is strongly encrypted.  Refer to my articles on data security in BI4 for more information on the specifics: Encryption & Data Security in BI 4.0

     

    The important consideration here, then, is how much access the AD account (v8\bossosvcacct above) has in Active Directory.  You should always consider security in depth: what if the account is somehow compromised?  How much damage could it do in your enterprise?

    This account only needs to list your Active Directory contents, which is controlled with the "List Contents" right.  While the best practices for locking down Active Directory are a little beyond the scope of this post, you can, for example, reduce the account's ability to query for additional user properties like email address.  Some examples are contained in this external blog on hiding AD objects.

     

    The account that the SIA runs under should also run with minimum privileges.   Suppose the process gets exploited somehow or the credentials fall into the wrong hands.  You most certainly don't want the account to have the capability to create a user or grant permissions.

     

    In many cases, users will use the same account for querying Active Directory as they do for running the SIA.

     

    When creating the account in AD, constrained delegation, while sometimes tricky to set up, allows you to limit the services for which a resulting Kerberos ticket can be used.   While this is not supported for OLAP on MSAAS when working with SSO to the database, it should work for all other use cases and is a way to restrict usage.

     

    The rights required on the local computer where the SIA is running are as follows:

    -Act as Part of the Operating System

    -Logon As a Service.

    -Read/Write to HKEY_LOCAL_MACHINE\SOFTWARE\SAP BusinessObjects\Suite XI 4.0

    -Read/Write to Install directory (specifically Write access to the Log Locations).

    The important part here is the account should NOT be an Administrator on the local machine.

     

    LDAP

     

    Unlike Active Directory, the logon via LDAP to the underlying identity provider will be sent in clear text over the wire unless you configure SSL.

    You can do this while going through the wizard or directly in your LDAP authentication screen after.

    At minimum, you should be using Server Authentication.   This allows you to ensure that BI only connects to a trusted LDAP source and will not send the LDAP credentials to an untrusted source, and of course it encrypts the actual traffic, as it should be.

     

    Again, the details you store here are stored encrypted in the CMS repository, and the deep details are here: Encryption & Data Security in BI 4.0

     

    SAP

    Your SAP authentication also requires an extra step to be encrypted.  This is done in the SNC configuration.  Notice that you can set different levels of encryption here, and this applies not only to the queries sent from the CMS to the SAP system for user authentication, but also to data access as you build out your reports.

    But WAIT you say, I'm using OLAP and UNX and I use STS (the security token service).  Isn't SNC for the legacy xir3 content like UNV?

    SNC is ALSO a security layer.   The SNC settings for encryption will be used for the STS communication when setting up your SSO to BW.   The summary here is that you should be configuring SNC always, at least for the Authentication level of quality of protection, and avoid sending around credentials unencrypted.

    You will notice that BI4.1 now ships with a default SNC library to help with the configuration and potentially save you the extra step of downloading the libraries by using the "Use Default" setting for SNC library settings.

     

    In the next part of the security blog series, I will look at protecting the web tier.

     

     

     

     

    Simplified BI Pricing and Licensing FAQ Available


    Back in Q3, we released a new, simpler pricing and licensing model.

     

    In Q4, we just released an FAQ document that publicly describes this simplified model.

     

    This can be found on the BusinessObjects BI Upgrade page. Here's a direct link.

     

    Please record your additional questions in the comments below, and I will incorporate those into v2.

     

    Thanks, Blair

     

    Blair Wheadon

    GM of Enterprise BI

    @blairtwheadon

    Secure Your BI Platform Part 2 - Web Tier


    In my previous blog, I covered securing of the communication of your authentication providers.

    In this posting, we will cover the configuration of the web tier.   This is your WAR file deployment, and probably the most exposed part of your deployment, especially if you're facing the public web.

     

    Reduce the attack surface.

    The less you have deployed, the less that can be attacked.   Although the default BI install will deploy a number of components, you likely don't need them all.

    You may see a list like this of war files deployed:

    AdminTools - designed for running advanced direct queries against the BI repository.   If you don't use this, remove it.   You could also consider running it on a separate, local access only deployment.

     

    BOE - This is the core of the BI deployment; it includes the CMC, BI Launchpad and OpenDocument functionality.  Note that using wdeploy, you can split the CMC and BI Launchpad deployments and put the CMC functionality on another, more locked-down application server.

     

    dswsbobje - web service used by Crystal Reports for Enterprise, Dashboard designer, and your custom applications.  Again something you can remove if none of the above apply to you.

     

    BusinessProcessBI - this is an SDK which is not needed for core functionality.  If you're not deploying custom applications that make use of this, this is something you can remove from your deployment.

     

    clientAPI - contains Crystal Reports ActiveX controls for custom application deployment.  You can almost certainly remove this.

     

    MobiServer & MobileBIService - if you are not deploying mobile, you should have no need for these.

     

    docs - This is the default Tomcat documentation.  It is also available online, so there should not be any need for it to be deployed; it also reveals the Tomcat version, which is information you need not expose.

     

    Tomcat Security

    Refer to your tomcat guide.  The following is an excerpt from the tomcat guide on default web applications:

    Tomcat ships with a number of web applications that are enabled by default. Vulnerabilities have been discovered in these applications in the past. Applications that are not required should be removed so the system will not be at risk if another vulnerability is discovered.

    http://tomcat.apache.org/tomcat-7.0-doc/security-howto.html#Default_web_applications

     

    Apache regularly publishes its list of fixed vulnerabilities here:

    http://tomcat.apache.org/security-7.html

    BI SPs regularly bundle updates of Tomcat.  SAP continually monitors the bundled applications, including Tomcat's security listings, and works to deliver any updates as part of the regular maintenance cycle.

    If you are unable to stay on the latest support packages, you may want to consider reviewing the list of vulnerabilities and using your own update of Tomcat at least until such time when you can deploy the latest BI4.x support pack.

     

    Tomcat User Account

    The Tomcat user account only needs to read files under the Tomcat directory.  Create a user for the Tomcat service account, give it the "Log on as a service" right, and read-only rights on the Tomcat folder.

     

    Secure the communication channel - Use TLS

    This should be a fairly well accepted policy already.

    While terms like HTTPS and SSL are thrown around, this should really mean "TLS" behind the scenes.  TLS is the newer protocol for secure communication; SSLv3 has now been rendered insecure, and you should be configuring your application servers to use TLSv1 or a higher protocol.

    If you are not using SSO exclusively to log on to the BI web apps (likely to be the case with the CMC, which does not support SSO), you should be encrypting the traffic and logging on over HTTPS.   Otherwise, the logon credentials will be passed from the browser to Tomcat, or the application server of your choice, in clear text over the wire.

     

    You've heard of POODLE?  Disable SSLv3 in Tomcat while you're at it.
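
    For a Tomcat 7 JSSE connector, restricting the enabled protocols might look like the sketch below (keystore values are placeholders; verify the attribute names against your exact Tomcat version):

    <Connector port="8443" protocol="org.apache.coyote.http11.Http11Protocol"
               SSLEnabled="true" scheme="https" secure="true"
               keystoreFile="conf/keystore.jks" keystorePass="changeit"
               sslEnabledProtocols="TLSv1,TLSv1.1,TLSv1.2"/>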

     

     

    Do you use flash?  Dashboarding, aka XCelsius

    The BI install deploys a file called crossdomain.xml, an XML document that grants a web client (such as Adobe Flash Player, Adobe Reader, etc.) permission to handle data across multiple domains.

    The default is very inclusive,

    <cross-domain-policy>
        <site-control permitted-cross-domain-policies="all"/>
        <allow-http-request-headers-from domain="*" headers="*" secure="false"/>
        <allow-access-from domain="*" secure="false"/>
    </cross-domain-policy>

    and you should take steps to lock it down if you allow hosting of Flash-based content.

    As this configuration file is completely outside of the SAP BI control, please refer to Adobe's documentation for crossdomain.xml
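
    As a rough illustration only (the domain is a placeholder; consult Adobe's documentation for the exact semantics of each attribute), a more restrictive policy could look like:

    <cross-domain-policy>
        <site-control permitted-cross-domain-policies="master-only"/>
        <allow-access-from domain="*.example.com" secure="true"/>
    </cross-domain-policy>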

     

     

    Protect Credentials

    If you're setting up Active Directory SSO, make sure you're not storing the credentials as a java option, but protect the password with a keytab instead.

    Don't do this (notice the wedgetail.idm.sso password in clear text):

     

    Do this instead:

     

    1. Create a keytab with the ktpass command

    The details for this are contained in the whitepaper attached to sap note http://service.sap.com/sap/support/notes/1631734

    The whitepaper is a must for anyone setting up AD for the first time.
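
    As a rough, hedged sketch only (all values are placeholders; the whitepaper gives the exact command for your setup), a ktpass invocation generally takes this shape:

    ktpass -princ <SPN>@<YOUR.REALM> -mapuser <DOMAIN>\<serviceaccount> -pass <password> -ptype KRB5_NT_PRINCIPAL -out bosso.keytab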

     

    2. Copy the .keytab file to the c:\windows\ directory of the application server

    3. Add the following line to C:\Program Files (x86)\SAP BusinessObjects\Tomcat\webapps\BOE\WEB-INF\config\custom\global.properties:

    idm.keytab=C:/WINDOWS/<your keytab file name>


    If you're using Trusted Authentication, make sure you secure the shared secret file, so that only the account your web application server runs as can access it.   Consider using OS file-level encryption to further lock this file down.



    Web Application Container Server

    If you are using the WACS to host your RESTful web services, or possibly the CMC, the configuration for secure communication is done through server properties in the CMC.

     

    What about Cross Site Scripting, SQL Injection, OWASP TOP 10?!   IS IT SAFE!!??

     

    SAP has very strict release criteria and a secure development lifecycle.  Testing includes, but is not limited to, static code scanning, dynamic analysis tools, manual penetration testing and security architecture reviews.   You can find out more about our security processes here:

     

    Conclusion

    The secure approach is to treat the internal network that all your end users access as compromised.   Just think of the latest Sony attack as an example to see the value of encrypting the communication channels.

     

    Additionally, leveraging firewalls to block off parts of the network to would be attackers is also valuable.  Firewalls and server communication will be covered in a later blog post.

     

     

    Feel free to add your comments/questions on other areas; the blog will be updated with any additional bits that may have been missed here.


     

    How to get some additional traces from the BI Platform Web Apps


    Have you ever seen a log file in your Application Server directory called TraceLog_<pid>_<date_timestamp>.glf?  Ever wondered what that was?

     

    This log file is generated by a number of the BI Platform web applications and by default will contain only Error level messages.  Here is an example of one I found on my test machine:

     

    Found in Directory: C:\Program Files (x86)\SAP BusinessObjects\tomcat\

    Filename:  TraceLog_1140_2014_11_20_05_23_52_898_trace.glf

    Contents:

     

    |64DF6F8D078E466397CBCD8D875B98240|2014 11 20 05:23:52.904|-0800|Error| |==|E| |TraceLog| 1140|  18|Start Level Event Dispatcher| ||||||||||||||||||||com.bo.aa.layout.DashboardManager||underlying implementation doesn't recognize the attribute

    java.lang.IllegalArgumentException: http://javax.xml.XMLConstants/feature/secure-processing

      at org.apache.xerces.jaxp.DocumentBuilderFactoryImpl.setAttribute(Unknown Source)

      at com.bo.aa.layout.DashboardManager.setDocBuilderFeaturesForXXE(DashboardManager.java:134)

      at com.bo.aa.layout.DashboardManager.<clinit>(DashboardManager.java:161)

      at com.bo.aa.impl.DBServerImpl.<clinit>(DBServerImpl.java:397)

      at com.bo.aa.servlet.AFBootServlet.InitServers(AFBootServlet.java:80)

      at com.bo.aa.servlet.AFBootServlet.init(AFBootServlet.java:47)

      at com.businessobjects.http.servlet.internal.ServletRegistration.init(ServletRegistration.java:81)

      at com.businessobjects.http.servlet.internal.digester.WebXmlRegistrationManager.loadServlets(WebXmlRegistrationManager.java:127)

      at com.businessobjects.http.servlet.internal.digester.WebXmlRegistrationManager.registerRest(WebXmlRegistrationManager.java:209)

      at com.businessobjects.http.servlet.internal.ProxyServlet.readXml(ProxyServlet.java:368)

      at com.businessobjects.http.servlet.internal.ProxyServlet.registerInternal(ProxyServlet.java:395)

      at com.businessobjects.http.servlet.internal.ProxyServlet.register(ProxyServlet.java:317)

      at com.businessobjects.http.servlet.config.WebXmlConfigurator.register(WebXmlConfigurator.java:60)

      at com.businessobjects.bip.core.web.bundle.CoreWebXmlActivator.start(CoreWebXmlActivator.java:66)

      at org.eclipse.osgi.framework.internal.core.BundleContextImpl$1.run(BundleContextImpl.java:782)

      at java.security.AccessController.doPrivileged(Native Method)

      at org.eclipse.osgi.framework.internal.core.BundleContextImpl.startActivator(BundleContextImpl.java:773)

      at org.eclipse.osgi.framework.internal.core.BundleContextImpl.start(BundleContextImpl.java:754)

      at org.eclipse.osgi.framework.internal.core.BundleHost.startWorker(BundleHost.java:352)

      at org.eclipse.osgi.framework.internal.core.AbstractBundle.start(AbstractBundle.java:280)

      at org.eclipse.osgi.framework.internal.core.AbstractBundle.start(AbstractBundle.java:272)

      at com.businessobjects.http.servlet.Activator.startBundle(Activator.java:129)

      at com.businessobjects.http.servlet.Activator.start(Activator.java:116)

      at org.eclipse.osgi.framework.internal.core.BundleContextImpl$1.run(BundleContextImpl.java:782)

      at java.security.AccessController.doPrivileged(Native Method)

      at org.eclipse.osgi.framework.internal.core.BundleContextImpl.startActivator(BundleContextImpl.java:773)

      at org.eclipse.osgi.framework.internal.core.BundleContextImpl.start(BundleContextImpl.java:754)

      at org.eclipse.osgi.framework.internal.core.BundleHost.startWorker(BundleHost.java:352)

      at org.eclipse.osgi.framework.internal.core.AbstractBundle.resume(AbstractBundle.java:370)

      at org.eclipse.osgi.framework.internal.core.Framework.resumeBundle(Framework.java:1068)

      at org.eclipse.osgi.framework.internal.core.StartLevelManager.resumeBundles(StartLevelManager.java:557)

      at org.eclipse.osgi.framework.internal.core.StartLevelManager.incFWSL(StartLevelManager.java:464)

      at org.eclipse.osgi.framework.internal.core.StartLevelManager.doSetStartLevel(StartLevelManager.java:248)

      at org.eclipse.osgi.framework.internal.core.StartLevelManager.dispatchEvent(StartLevelManager.java:445)

      at org.eclipse.osgi.framework.eventmgr.EventManager.dispatchEvent(EventManager.java:220)

      at org.eclipse.osgi.framework.eventmgr.EventManager$EventThread.run(EventManager.java:330)

     

    For those of you who have spent time looking at these types of messages, you will likely recognize a few things.  The first is that the bulk of this error message is a Java backtrace. Backtraces are often read from the bottom up, and they give you an idea of the sequence of calls that occurred leading up to the error.  In this case, we can see the error:

     

    com.bo.aa.layout.DashboardManager||underlying implementation doesn't recognize the attribute

    java.lang.IllegalArgumentException: http://javax.xml.XMLConstants/feature/secure-processing


    This tells us what caused the error trace log entry, but we might be more interested in what happened leading up to the error. For that, we can traverse the backtrace to get an idea of what was going on beforehand.


    In this case, I have no idea what actually caused this error.  I just found it on my test machine from around 3 weeks ago.  But from the backtrace, I can make an educated guess that the cause was related to a Dashboard layout of some sort.  Regardless, this is not the purpose of this blog, so I will move on.


    The error messages found in these TraceLog*.glf files are not usually enough to properly troubleshoot an issue.  To get proper details around what causes an issue, we need more verbose logging.

     

    One way we can enable verbose logging for the BI Platform Web Apps is by enabling it in the CMC.  Section 25.4.1 in the BIP Administrator's guide covers how to do this.  In the CMC, you can enable traces for the BI Launchpad, CMC, Open Document, Promotion Management, Version Management, Visual Difference and Web Services applications.

     

    Another way to enable tracing for the BI Platform web apps is to follow the steps below.  I have found additional details in these log files that weren't available through the CMC-enabled logs:

     

    Steps to set up verbose logging for the TraceLog application server traces (example for Tomcat)


    1. Go to this folder and copy the BO_Trace.ini:  C:\Program Files (x86)\SAP BusinessObjects\tomcat\webapps\BOE\WEB-INF\TraceLog
    2. Paste this file in the C:\Program Files (x86)\SAP BusinessObjects\tomcat directory and rename it to TraceLog_trace.ini
    3. Edit this file and change the line:
      sap_trace_level = trace_error;

          to

      sap_trace_level = trace_debug;
    4. Find the line below it and change it as well:
      sap_log_level = log_error;

          to

      sap_log_level = log_info;
    5. I also like to change append = true; to append = false;, which makes the log file names include the process ID and date/time stamp.
    6. Save the TraceLog_trace.ini file and within a minute, you should start seeing some log files growing in the Tomcat directory.
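
    After these edits, the relevant lines of TraceLog_trace.ini should read as follows (a sketch showing only the settings changed above; the rest of the file stays as copied):

      sap_trace_level = trace_debug;
      sap_log_level = log_info;
      append = false;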

     

    Here is an example of what my log files contain after enabling the above log levels:

     

    |039A2887DCF24130ADA77A3BA3DBF3A6155|2014 12 17 14:57:25.015|-0800|Information| |==| | |TraceLog| 1144|  47|http-bio-8080-exec-7| ||||||||||||||||||||com.businessobjects.bip.core.web.bridge.ServletBridgeLoggingDelegate||servletbridge.relative.url.prefix.out.of.bundle: ../..

     

    |039A2887DCF24130ADA77A3BA3DBF3A6156|2014 12 17 14:57:25.015|-0800|Information| |==| | |TraceLog| 1144|  47|http-bio-8080-exec-7| ||||||||||||||||||||com.businessobjects.bip.core.web.bridge.ServletBridgeLoggingDelegate||servletbridge.relative.url.prefix.to.root.of.bundle: ../../IVExplorer

    We can see that the type of entry is "Information" now which tells us our settings are being used.

     

    Now, this trace is quite verbose, so really the only time I would recommend using it is when you can reproduce an issue in a short period of time.  To deactivate the trace, just edit the TraceLog_trace.ini file and set the trace/log levels back to *_error.

     

    Do not delete the file as this will not deactivate the current trace levels.  Just edit and save the file to deactivate it.  If you do delete the file, you will need to restart Tomcat to disable the traces again.

     

    Anyway, this trace can sometimes give you additional details that are not available through other tracing methods.  Be sure to deactivate it as soon as you are done using it, though, as it does have a slight impact on performance.

     

    Thanks,

    Jb


    Deep dive into the BI 4.x Audit data model


    Overview:


    Auditing is an important out-of-the-box solution for keeping track of the usage patterns of the SAP BOE platform. Audit data is relevant both from an administration perspective and from a compliance perspective, for maintaining an audit trail over a specified interval of time. While the sample audit universe acts as a starter kit for reporting on audit data (http://scn.sap.com/docs/DOC-53904), knowledge of the underlying data model helps us build our own queries and reports and optimize them better for performance. As a starting point for understanding how auditing works and what information is audited, refer to the relevant chapters in the BI Platform admin guide, downloadable at http://help.sap.com/boall_en/; e.g. in sbo41sp3_bip_admin_en.pdf, chapters 21 and 33 talk about auditing. There are also several insightful blog posts on auditing and audit reporting by 'Manikandan Elumalai' on SCN.

     

    Any SQL examples shown in this blog post are based on an audit database hosted in Oracle. However, they can easily be adapted to any other SQL dialect, as the table structures remain the same.

     

    Audit Data Model:

     

    The audit database is designed for both transactions and querying. Audit data is continuously written to it by BOE, and at the same time audit reports and queries can be fired against it to report near-real-time audit information.

     

    There are two main transaction tables in the audit database: ADS_EVENT and ADS_EVENT_DETAIL. The remaining tables are either lookup or bridge tables. Any auditable action in BOE is captured as a unique Event_Id stored in ADS_EVENT, and each Event_Id can have one or more detail records (Event_Detail_Id) in ADS_EVENT_DETAIL. Both the event and its corresponding details can be of specific types and can have other supporting attributes.

     

    This core concept of auditing has remained unchanged since BO XI 3.1, though the number of tables has increased significantly in the BI 4.x audit database. The increase in the number of tables is primarily due to the increase in the attributes being captured and further normalization of the data structures.

     

     

    BO XI 3.1 Audit Data Model

    BOXI3.1_Audit.JPG

     

    BI 4.x Audit Data Model

    BI4.1_Audit.JPG

     

     

     

     

    Audit Data Dictionary:

     

    The best way to analyze the audit database is to use a GUI-based database client like Oracle SQL Developer. The following queries are helpful in listing the data dictionary:


    -------

    select owner, object_name, subobject_name, object_type
    from all_objects
    where owner = '<Schema Name where audit tables are created>'
    order by object_type, object_name;

    -----

    select owner, index_name, index_type, uniqueness, table_name, table_type
    from all_indexes
    where owner = '<Schema Name where audit tables are created>';

    ------

    desc <each table name>;

    ------

    A few clear trends emerge from the output of the above queries:

    • Only tables and indexes are present in the audit database; no views, procedures, materialized views etc. exist
    • There is no enforced referential integrity between the tables, i.e. no primary and foreign keys
    • Index types are normal and either unique or non-unique
    • Due to multilingual support being available by default in BI 4.x, all lookup tables (names ending with _STR) have 'Language' as an additional field
    • The field EVENT_DETAIL_VALUE in ADS_EVENT_DETAIL is of datatype CLOB; the remaining columns in all the tables are of varchar2, numeric or date datatypes

     

    Building Audit Queries:


    Common audit reporting scenarios involve metrics like Count of Events, Last <Event Type> Timestamp, and Count of Users. All of these metrics can be derived from the table ADS_EVENT. Supporting details for an event can be obtained from ADS_EVENT_DETAIL, and descriptions of attributes can be obtained from the lookup tables after joining with either ADS_EVENT or ADS_EVENT_DETAIL. It is important to apply suitable filters to the queries to optimize performance; common filter criteria are based on date, event type, detail type, language etc.
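
    For instance, a simple daily activity query (a sketch that uses only the ADS_EVENT columns already referenced in this post, START_TIME and USER_NAME) could look like:

    ----

    select trunc(start_time) event_date,
    count(*) event_count,
    count(distinct user_name) user_count
    from ads_event
    where start_time >= sysdate - 7
    group by trunc(start_time)
    order by event_date;

    ----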

     

    Example scenario: reporting user group membership details for users who have logged into BOE in the past 30 days:


    ----

    SELECT DISTINCT USER_NAME, USER_GROUP FROM (
        SELECT ae.USER_NAME USER_NAME,
               dbms_lob.substr(ad.EVENT_DETAIL_VALUE, 2000, 1) USER_GROUP
        FROM ADS_EVENT ae, ADS_EVENT_DETAIL ad
        WHERE ae.EVENT_ID = ad.EVENT_ID
        AND ad.EVENT_DETAIL_TYPE_ID = 15 --Denotes detail type: User Group Name
        AND ad.event_detail_value not like 'Everyone%' --To eliminate the 'Everyone' group records
        AND exists
            (select 1 from ads_event X
             where X.event_type_id = 1014 --Denotes event type: Logon
             and X.event_id = ae.event_id
             and X.start_time >= sysdate-30))
    WHERE rownum < 50001
    ORDER BY USER_NAME;

    ------

    The above query converts the CLOB datatype to VARCHAR2 via dbms_lob.substr. Once converted, regular string operations such as DISTINCT and ORDER BY can be applied to the results.

     

    Concluding Remarks:

     

    The above write-up is not an exhaustive reference on the audit database. Readers are encouraged to validate the above contents against the standard BI Platform admin guide. Comments are welcome to further enhance this blog post. Thanks for your time!

    Securing your BI Platform part 3 - Servers


    Communication to Identity providers like Active Directory, LDAP and SAP was covered in part 1, and securing the web tier was covered in part 2.

    Now let's look at the actual BI servers, like the Central Management Server (CMS), File Repository Server (FRS) and others.

     

    We'll look at port restrictions, potential firewall setups, SSL/TLS and other configuration switches.

     

    FIPS 140-2

    By now you may have read about the -fips parameter on the SIA.  FIPS stands for Federal Information Processing Standard.  I cover this mode more in my data security blog.  The quick summary is that BI4 uses FIPS certified encryption libraries to perform its encryption.  


    Turning this switch on (add "-fips" to the SIA command line) prevents usage of older clients and disables some older functionality.  If you do not have any xir3 clients or custom applications running against your BI4 system, there is no reason NOT to have this switch on.  Do expect this to become the default in upcoming maintenance releases, where you will instead need a special switch to turn ON old functionality; by default, an xir3 or older client will NOT be able to connect.

     

    It is not just about enforcing stronger BI4 security.  By disabling older functionality, you again reduce the attack surface, and a server not accepting calls based on older functionality will be harder to exploit.  If you're familiar with the POODLE attack, you'll know for example that the latest recommendation is to outright disable SSLv3 protocols and use strictly TLS.   A similar concept applies here.



    Minimum Privileges

    Creating a special locked-down user to run BOE can be worthwhile.  The built-in Windows system account is actually quite powerful.

     

    The rights required on the local computer where the SIA is running are as follows:

     

    -Logon As a Service.

    -Read/Write to HKEY_LOCAL_MACHINE\SOFTWARE\SAP BusinessObjects\Suite XI 4.0

    -Read/Write to Install directory (specifically Write access to the Log Locations).

    The important part here is the account should NOT be an Administrator on the local machine.

     

     

    Server to Server channel Encryption (CORBA SSL)

    The how-to steps for server-to-server communication encryption are detailed well in the BI4 admin guide, as well as in this online wiki for Unix:

    The client configuration is detailed in SAP note 1722634.

    How much of a performance hit can you expect?    It really depends on many factors; there is often a performance tradeoff for security, but as rough guidance, expect a 10%-20% impact based on what I have seen so far.



    File Repository Server

    This is an important server to protect, because it contains your report content on the file system.  If the reports are saved as PDF or saved with data, that makes them very valuable to attackers.  There are a few additional things you can do to protect the content.

    -Secure the FRS OS folders so that only the account running the SIA that hosts the FRS can access them

    -Use file level encryption.  This can protect the content from unauthorized access through the local machine. 

    -Virus scanning.  For large deployments and heavy usage, this can be a big bottleneck on the I/O, to the point that performance visibly suffers.    For performance reasons, you may consider running scheduled scans in "off hours" rather than real-time virus scanning.  Real-time virus scanning is by far more secure, but you can further mitigate by locking down what users can upload.

    -Limit content types from being uploaded:

    Rather than granting the generic "Add Objects" right, you can actually lock it down to content types, and permit only CR, Webi etc. types of documents.  This will prevent a user from uploading a bad executable or batch file that another user then downloads and executes on their own machine.  Of course one would hope that end users would know better, but prevention is your best defense.


     

    Default Accounts

    All BI installations start with a default "Administrator" account.  For a potential attacker, that is one known piece of information for trying a brute force attack.  Enabling auto lockout for failed attempts will certainly help mitigate this; however, another thing you can do is rename the default account.  Instead of "Administrator", use your own naming such as <Company>_BI_Admin, for example SAP_BI_Admin.

     

    Stale Accounts

    Have people left the company?  Maybe never even logged in?  The fewer accounts you have, the less chance of an old stale password falling into the wrong hands, or of accounts being misused.  It is again about reducing the attack surface.

     

    The following query, which you can run using the AdminTools console, will return to you a list of users by the last logon time.

    SELECT SI_NAME, SI_LASTLOGONTIME FROM CI_SYSTEMOBJECTS WHERE SI_KIND = 'user' ORDER BY SI_LASTLOGONTIME DESC

    Below is a stripped down sample output.  While these users may have content in personal folders you don't want to lose, consider disabling the accounts.

     

    Ports, Firewalls

    Firewalls help you reduce the attack surface.  In the simplest, happiest (from a security standpoint) workflow, all your users are web users and will only be connecting to BI Launchpad.  In this case, the BI servers can be firewalled away from the end users.  However, chances are you also have thick clients connecting.  In that case, make sure the thick clients are limited to connecting from a trusted network zone, if networks are partitioned.

    You can bind servers to a specific port in the CMC.

     

    The CMS has both the name server port and the request port that you can configure:

     

    By setting a specific range of ports to use or binding to specific ports, you can then use a firewall to further lock down and reduce the attack surface of your servers.

     

    Keep in mind that thick clients must be able to communicate with the CMS, as well as with the Input and Output File Repository Servers.  There is a fairly complete overview of server port communication in the administration guide, section 8.14.2.

     

    Your IT may have also put your database layer into a separate network zone, inaccessible to regular workstations.  Yes, IT is making your life difficult, but for a good reason in the classical 3 tier architecture.  Clients can and should (for security purposes) connect through the BI platform which in turn connects to the database layer.  This extra hop makes it more difficult for a connection to abuse or attack the database layer directly, where all your valuable data resides. 


    Database Encryption

    The communication between the BI processing servers and the actual database can, and from a security standpoint should, be encrypted.  To help you decide, a threat model should be done; how sensitive the data is and how isolated the data sources are, are just two considerations.   Generally, one should assume that the network HAS been compromised and build out a security-in-depth approach: it is quite easy for someone in your company to fall for a phishing attack.    You can set database encryption at the driver level, below being an example for a SQL Server driver:

     

    CMS DB Encryption

    The CMS repository does not store any data from your reports; however, it can store sensitive metadata such as connection information.  This is automatically encrypted using a two-key mechanism as part of the BI4 built-in encryption.  Again, this is described in my encryption & data security blog.

    Using your database vendor's built-in encryption to encrypt the whole repository may be overkill here, and is not something I would strongly recommend as necessary, but it is certainly a valuable 'security in depth' option.   The advantage of selectively encrypting content, the way the BI4 process does, is that you do not suffer performance hits encrypting non-essential data, such as the metadata associated with a report's layout.

     

    Temporary Files

    During document creation and processing, temporary files will be created, and they may contain some data.  Have a look at your temporary folders, and lock these down to the account that the SIA service hosting these servers runs under.   See below for the Crystal Reports processing server as an example.

    Placeholders like "%DefaultDataDir% and others are defined under the placeholders tab of your Server Properties.

    %DefaultDataDir% defaults to "/SAP BusinessObjects Enterprise XI 4.0/Data/"

     

    SAP BI 4.1 SP5 IDT Business Security & Data Security Profile Filter Implementation


    SAP BI 4.1 IDT SP5 Business Security & Data Security Profile Filter Implementation

     

    Environment:

     

    SAP BO 4.1 SP5

    IDT - Single Source UNX

    Windows 2008 Server

     

    Implementation Scenario:

     

    My project has a requirement to implement security on top of the universe classes/objects, in such a way that certain departmental users have access only to the subset of objects for their department, e.g. Purchase or Receipts. At the same time, we want to use the same universe for other users who are not in any department but need access to all the classes/objects. The catch is that the department users also need an extra filter condition defined in their security profiles.

     

    I will explain this scenario with the actual problem I faced and the workaround I used to overcome it. My initial assumption was that this column/object-level security, including the filters required on top of the classes/objects, could be achieved through Business Security Profiles alone. That turned out to be FALSE! Some playing around with the security profiles helped me understand the reason(s) behind it, which I cover in the later part of this blog.


    To start with, I created two business layer views:

     

    1. Created View with Purchase Only objects in IDT on my Single Source UNX.

    2. Created another View with Receipts Only Objects in the same universe.

     

     

    Then I created two Business Security Profiles:

     

    1. BS_Purch_Only_Profile with filter PO_Type = 'BULK'

         - GRANT Purch_only object

     

    2. BS_Receipt_Only_Profile with filter R_PO_Type='BULK'

         - Grant Receipt Only object

     

     

    My requirement is to define the Business Security Profiles in IDT in such a way that, for USER1 (who is in both of the above profiles), the filters remain separated in the WebI query. With the above approach, the net Business Security Profile query puts PO_Type = 'BULK' AND R_PO_Type='BULK' into the WHERE clause of a WebI report on either Purchases or Receipts.
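
    To illustrate with a sketch of just the generated filter portion (tables and joins omitted):

    --What the net Business Security Profile generates (unwanted):
    WHERE ... AND PO_Type = 'BULK' AND R_PO_Type = 'BULK'

    --What we want, per query:
    WHERE ... AND PO_Type = 'BULK'     (Purchases query)
    WHERE ... AND R_PO_Type = 'BULK'   (Receipts query)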

     

     

    This generation of 'AND' between the filters is by SAP's design (for Business Security Profiles) and can be handled through Data Security Profiles.

     

     

    ===========================================================================================

     

     

    Observations and Work around:

     

     

    i) Filters in Business Security Profiles always come combined with 'AND' in the WHERE clause (net profile security) if the user is in multiple Business Security Profiles. So there is no way to overcome this within Business Security Profiles, as this is by SAP design.

     

    The reason for this design is that classes and objects can spread across subject areas, so these filters primarily help at the class and object level, irrespective of which tables are hit in the database. If you have a further level of security requirement, you need to control the row-level fetch, which is exactly what a Data Security Profile does.


    ii) I used Data Security Profiles to apply the filters and Business Security Profiles to display the subset of objects (e.g. Purchase vs. Receipts class objects) in IDT, as per the requirements.

     

    Steps in Detail:

     

    - Create a Data Security Profile on top of the Purchase Only view.

    - Add a filter to the Data Security Profile: DS_PO -> Rows -> Add the PO_Type = 'BULK' condition.

    - Similarly, create a Data Security Profile for the Receipts Only view: DS_Receipts -> Add the R_PO_Type = 'BULK' condition.

     

    Now USER1 is assigned both the Data(DS_PO & DS_Receipts) and Business Security Profiles(BS_Purch_Only_Profile & BS_Receipt_Only_Profile).

    The net security profile gives my expected results!

     

    NOTE: You cannot test security profiles in IDT -> Business Layer -> Queries, but you can check the net security profile in the Security Editor by selecting together the universe on which the profiles are defined and the user!


    Validate:

     

    In WebI, validate as USER1:

    Query 1: a query built on the Purchases Only view generates ONLY the PO_Type = 'BULK' filter, instead of both filters as in the initial problem.

    Query 2: a query built on the Receipts Only view results in ONLY the R_PO_Type = 'BULK' filter.

    Both queries execute in one report and give me two separate tables.

     

    The above 2 steps helped to implement my requirement.

     

    Advantages:

     

    - Users who are in only one group will have just the one required filter instead of all the filters in the WHERE clause!

    - Users not in any security profile will have NO filter conditions. This is an awesome feature, in my opinion!!

    - Replace my USER1 with a Purchase or Receipts department group created in the CMC, and the entire department's security is controlled!

     

     

     

    Hope this helps those who want similar security implementations! I encourage you to add questions/comments on similar issues.

     

    Thanks,

    Chithresh

    SAP BusinessObjects BI4 Implementation Report NOW Available


    Dear SCN Community Members,

     

    We are pleased to announce the availability of the SAP BusinessObjects BI4 Custom Implementation Report. With this report, we will help you understand the best option for implementing your SAP BusinessObjects BI4 deployment based on your organisational requirements. Based on a set of questions and your input, an Implementation Report will be generated containing a long list of recommendations and links to relevant content to further enable you to deploy SAP BusinessObjects BI4 successfully.

     


     

    You can run your own Custom Implementation report via : https://www.sapbusinessobjectsbi.com/implement/

     

    Please share your feedback with us!

    Regards

    Merlijn Ekkel

     

    Director BI Solutions | SAP GMT BI | Solution Management

    OpenSAP BusinessObjects BI 4 Training


    A fantastic opportunity for you to learn more about BusinessObjects BI 4 is currently being offered by SAP via OpenSAP.

     

    Enroll now:

     

    https://open.sap.com/courses/bifour1-1

     

    Here is the course summary:

     

    “We live in a world where big data, people, machines and processes are interlinked in an internet of everything. Immense value can be unleashed by connecting this information to the work we do every day, enabling us to quickly discover what is happening and then act with the power of collective insight. Learn how to unleash this power by implementing SAP BI 4 with our new SAP BusinessObjects BI 4 Platform Innovation & Implementation Training course offered through openSAP.

    Successful deployments require proper sizing, hardware, configuration, security and administration. This course, designed for experienced BI system administrators, is brought to you by the Strategic Customer Engagements Team, who are SAP’s most senior SAP BusinessObjects BI specialists.”

     

    Enjoy the learning experience!
