Monday, May 25, 2009

Install PerformancePoint Server 2007 on Windows Server 2003 SP2

I faced this problem while installing PerformancePoint Server 2007 in a virtual environment running Windows Server 2003 Service Pack 2: the installer's operating system check blocks the setup.

To work around it, execute the following command from the Run prompt:

msiexec /i PscSrv.msi SKIPOSCHECK=true

 

Installing PerformancePoint Monitoring Server on a Domain Controller

1. If you are using a domain account, you must add it to the Administrators and IIS_WPG groups before installing the Monitoring Server. You don't need to do this if you are using the Network Service account (recommended).

2. To add a domain account to the appropriate groups, open Active Directory Users and Computers (Start > Administrative Tools):

a. Add it to the Domain Admins group.

b. Add it to the IIS_WPG group.

3. Install the Monitoring Server MSI using this command line:

a. msiexec /i PscSrv.msi SKIPOSCHECK=true

Wednesday, May 20, 2009

How to Install PerformancePoint Server Service Pack 2

The diagram below will help you plan and install your PPS SP2 deployment.

Just for the record: you can't install PPS SP2 on the free trial version of PPS. I'm still working on a workaround.

[diagram: PPS SP2 installation and planning]

BIDS Helper 1.4.1.0

CodePlex has just released a new version of BIDS Helper, a tool that extends BI Development Studio (BIDS) with a large set of features, described below from CodePlex.

 

[image: BIDS Helper]

 

Features
Installation

To install BIDS Helper, download the installer from the Releases tab.
If for some reason you cannot use the installer, the latest release also includes an xcopy-deploy option.

Tuesday, May 19, 2009

Slicing Analysis Services (SSAS) Partitions and more from Dan English's BI Blog

Just wanted to touch base on a couple of items regarding partitions, referencing the ever-popular Adventure Works 2008 sample Analysis Services project available for download from CodePlex here (SQL Server 2008 SR1 downloads). It appears that the databases are now bundled together; they used to be broken out into the operational and data warehouse databases. The sample Analysis Services project is part of this download and, if you go with the default setup, is located in the following directory: C:\Program Files\Microsoft SQL Server\100\Tools\Samples\AdventureWorks 2008 Analysis Services Project.

Now if you open up the Enterprise version, you will see that the measure groups have multiple partitions defined for them. One thing to note is that the estimated counts are not properly updated and that aggregations have not been defined for all partitions in the cube.

[screenshot: partitions with stale estimated counts]

If you have BIDS Helper installed, you can use the add-in to update the estimated counts for all of your objects.

[screenshot: BIDS Helper's Update Estimated Counts option]

Just a warning: if you are going against a very large database, you might not want to perform this operation.

[screenshot]

As a workaround, you could simply update the estimated row count in the partition's properties to give Analysis Services an estimate to use when you define and generate aggregations for the partitions. (There is also a Partition Count setting in the aggregation design wizard that the algorithm uses when creating aggregations, so it is important to set these values.)

[screenshot: the partition's estimated row count property]
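For reference, the same update can be scripted with AMO (Analysis Management Objects). This is a minimal sketch; the server, database, and object names are assumptions based on the Adventure Works 2008 sample, and the count itself is a placeholder you would normally pull from the fact table.

using Microsoft.AnalysisServices;

static void SetEstimatedCounts()
{
    Server server = new Server();
    server.Connect("localhost");
    MeasureGroup mg = server.Databases.GetByName("Adventure Works DW 2008")
                            .Cubes.GetByName("Adventure Works")
                            .MeasureGroups.GetByName("Internet Sales");

    foreach (Partition p in mg.Partitions)
    {
        // In practice, query the fact table for the real row count
        // of each partition's date range.
        p.EstimatedRows = 1000000;  // hypothetical value
        p.Update();                 // persist the change on the server
    }
    server.Disconnect();
}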

After the estimated counts have been updated you will see the new values, but you still need to design aggregations for the partitions where these have not been defined yet (you might need to save the cube file, close it, and reopen it to see the updated counts).

[screenshot: updated estimated counts]

To create the aggregations in SSAS 2008, you have to switch over to the new Aggregations tab in the cube designer. You can then select the measure group that you want to design the aggregates for and walk through the wizard.

[screenshot: the Aggregations tab]

And you can generate the aggregates for all of the partitions at once.

[screenshot: aggregation design wizard covering all partitions]
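If you prefer to script this step, here is a minimal AMO sketch that assigns an existing aggregation design (for example, one produced by the wizard) to every partition in a measure group. It assumes the design already exists on the measure group.

using Microsoft.AnalysisServices;

static void AssignAggregationDesign(MeasureGroup mg)
{
    AggregationDesign design = mg.AggregationDesigns[0];  // first design on the measure group
    foreach (Partition p in mg.Partitions)
    {
        p.AggregationDesignID = design.ID;  // link the partition to the design
        p.Update();                         // send the change to the server
    }
}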

Ok, now let's get back to the partitions portion. I am going to modify the Internet Sales partition for 2004, breaking it out into Q1 and placing Q2 in a separate partition. This really doesn't need to be done for the Adventure Works data, since the volume is extremely small, but in a real-world scenario this could definitely improve query performance (plus reduce processing time if you are just processing the latest quarter instead of the entire year).

[screenshot: the new quarterly partitions]

I went ahead and modified the query of the existing 2004 partition so that the cutoff for OrderDateKey was less than 20040401 instead of 20041231. You need to be careful not to overlap the ranges; there is no validation going on, so you could potentially include data that is already in an existing partition. After I had modified the existing 2004 partition and added the new partition, I updated the estimated counts.

[screenshot: updated estimated counts for the split partitions]
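The same split can also be expressed in AMO. A sketch under assumptions: the partition name and data source ID are taken from the sample project, and the real queries must return exactly the same column list as the original partition's query (SELECT * is just shorthand here).

using Microsoft.AnalysisServices;

static void SplitInternetSales2004(MeasureGroup mg, string dataSourceId)
{
    // Shrink the existing 2004 partition to Q1 only.
    Partition p2004 = mg.Partitions.GetByName("Internet_Sales_2004");
    p2004.Source = new QueryBinding(dataSourceId,
        "SELECT * FROM dbo.FactInternetSales " +
        "WHERE OrderDateKey >= 20040101 AND OrderDateKey < 20040401");
    p2004.Update();

    // Add a Q2 partition with a non-overlapping range. Remember:
    // nothing validates the boundaries for you.
    Partition q2 = mg.Partitions.Add("Internet_Sales_2004_Q2");
    q2.StorageMode = StorageMode.Molap;
    q2.Source = new QueryBinding(dataSourceId,
        "SELECT * FROM dbo.FactInternetSales " +
        "WHERE OrderDateKey >= 20040401 AND OrderDateKey < 20040701");
    q2.Update();
}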

Now that this is done, let's run a query against the cube and take a look at what is going on. Here is the query that I will execute against the cube:

SELECT {Measures.[Internet Sales Amount],
        Measures.[Internet Order Quantity]} ON 0,
NON EMPTY ([Customer].[Customer Geography].[Country] *
        [Product].[Category].[Category]) ON 1
FROM [Adventure Works]
WHERE ([Date].[Calendar].[Month].&[2004]&[4])


[screenshot: query results]
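If you want to run the same MDX from code rather than from SSMS, here is a minimal ADOMD.NET sketch (the connection string is an assumption):

using System;
using Microsoft.AnalysisServices.AdomdClient;

static void RunQuery()
{
    using (AdomdConnection conn = new AdomdConnection(
        "Data Source=localhost;Catalog=Adventure Works DW 2008"))
    {
        conn.Open();
        AdomdCommand cmd = conn.CreateCommand();
        cmd.CommandText =
            "SELECT {Measures.[Internet Sales Amount], " +
            "Measures.[Internet Order Quantity]} ON 0, " +
            "NON EMPTY ([Customer].[Customer Geography].[Country] * " +
            "[Product].[Category].[Category]) ON 1 " +
            "FROM [Adventure Works] " +
            "WHERE ([Date].[Calendar].[Month].&[2004]&[4])";

        // Print every cell so you can compare against the results above.
        CellSet cs = cmd.ExecuteCellSet();
        foreach (Cell cell in cs.Cells)
            Console.WriteLine(cell.FormattedValue);
    }
}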



So the query executed, and if we look at the Profiler trace we can see that the query actually touched multiple partitions in the measure group we were querying. The reason is that we have modified the basic one-partition-per-year layout. If we had left it at the year level it would have been fine, but since we are dividing the year into multiple parts, Analysis Services no longer knows which partition holds the data to satisfy the query.



Let's go back into the partitions in the cube and set the 'Slice' property for the partitions. This is where you specify the tuple that defines the partition.



[screenshot: setting the partition Slice property]
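In AMO, setting the slice is a one-liner per partition. A sketch, assuming the partition names from the split above and the Calendar Quarter member keys from the Adventure Works Date dimension:

using Microsoft.AnalysisServices;

static void SetSlices(MeasureGroup mg)
{
    mg.Partitions.GetByName("Internet_Sales_2004").Slice =
        "[Date].[Calendar].[Calendar Quarter].&[2004]&[1]";
    mg.Partitions.GetByName("Internet_Sales_2004_Q2").Slice =
        "[Date].[Calendar].[Calendar Quarter].&[2004]&[2]";

    foreach (Partition p in mg.Partitions)
        p.Update();  // persist the slice definitions
}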



Now that we have this set up, we will redeploy and run the query again. You will need to define the 'Slice' property on each of the partitions in the measure group.



[screenshot: Profiler trace hitting only one partition]



UPDATE (5/16/2009): fixed this picture (before I had highlighted the aggregation, not the partition)



Now that we have defined the 'Slice' for the partitions, we see that our query only touches the partition that we are querying against, and the response time was faster. Granted, this is a small set of data, so it doesn't really make much of a difference here, but you can imagine what this would do for a very large dataset. And if we switched over to reference the Fiscal hierarchy instead, we would see the same results.



That is it for now. I hope you enjoyed this little tip, and I want to thank Siva Harinath and Howie Dickerman from Microsoft for their presentation at last year's Microsoft BI Conference, Designing High Performance Cubes in SQL Server 2008 Analysis Services, where they pointed out this item.

Data Warehouses and the Average of Averages

I am currently working on a project that uses data entry screens as a data source. The data is pulled from an operational database into the data warehouse on a quarterly basis, and users enter the KPI values quarterly as well.

The problem: if we have a measure calculated using the formula (A+B)/C, what should I store in the data warehouse, the computed value or the value of each metric on its own?

Has anyone faced this design before? If so, any ideas?

Monday, May 18, 2009

PerformancePoint: Programmatically Create a New Scorecard

Working with PPS programmatically requires an understanding of the PPS web services.

Below is the first blog entry in a series to help others work with PPS programmatically.

void CreateScoreCard()
{
    Scorecard scorecard = new Scorecard();
    scorecard.Guid = Guid.NewGuid();

    // The following lines are optional: if you don't set a name,
    // the scorecard is created with an empty name.
    BpmPropertyText nameProperty = new BpmPropertyText();
    nameProperty.Text = "The ScoreCard";
    scorecard.Name = nameProperty;

    // Publish the scorecard on the server.
    Publisher.CreateScorecard(scorecard);
}



Sunday, May 17, 2009

Ralph Kimball vs. Bill Inmon

I came across this wonderful comparison between Ralph Kimball and Bill Inmon on the way data marts should be built, and I decided to share it on my blog for your reference.

Areas of agreement

  • Consensus on need for solid business requirements and end-user validation
  • Agreement that it is rarely feasible to build an entire warehouse at once - incremental development with a focus on high-priority elements
  • Conformed dimensions are desirable
  • Warehouse data should be stored at the most atomic level possible
  • Star schema is most desirable format for data marts

Areas of differences

  • Bill Inmon
    • Approach is known as Top-Down, or Corporate Information Factory
    • Warehouse data should be stored in a centralized relational structure
    • Dependent data marts should be created from central warehouse
      • These data marts will employ star schemas
      • Additional transformations may be employed between the warehouse and the data marts
    • ODS is used for transaction level detail with little to no history


[diagram: Inmon vs. Kimball]

  • Ralph Kimball
    • Approach is known as Bottom-Up, or Kimball Bus Architecture
    • “… The data warehouse is nothing more than the union of all the data marts …”
    • Conformed dimensions are the glue that unites disparate data marts, whereas Inmon maintains conformed dimensions within the centralized relational structure
    • ODS may be integrated directly into warehouse


Thursday, May 14, 2009

Office 2010 Features List

I came across an amazing list of features from Office 2010 and decided to publish it here for reference.

Here's what is known at this point:

Microsoft officials are continuing to decline to comment on Office 14's timetable or feature list. Nonetheless, a few bits of information about O14 have gone public. Among them:

Upgrade SharePoint SQL Express to Standard or Enterprise

Source : Todd Klindt's SharePoint Admin Blog

Microsoft so very graciously provides a free version of SQL 2005, SQL Express, with MOSS. If you install MOSS using the Basic option or Single Server under Advanced you get SQL Express automatically. So what if as a budding newbie SharePoint admin you chose the Basic option, but now as a wise aged SharePoint admin you've seen the error of your ways and want to use a more respectable version of SQL for your SharePoint backend? You're in luck. In this blog post I'll walk you through upgrading SQL Express to SQL Standard or Enterprise.

First thing you need to do is get a copy of SQL 2005 Standard or Enterprise. Which version you choose depends on the redundancy and availability you want; either will upgrade from Express. After you have your media you can start the install. This step is important, as you have to pass the setup program a parameter to let it know you're doing an upgrade. To upgrade, start the setup like this:

setup.exe SKUUPGRADE=1

The setup should kick off, and it won't give you any indication that you passed it a parameter. Never fear, it will come up later. When the setup gets to the instance selection, make sure you don't accept the default instance; choose the SQL Express instance instead. First click Installed Instances.

Then choose the OFFICESERVERS instance from the list and hit OK.

Your next screen should confirm that it found SQL Express.

If you click the Details… button you'll see that the setup confirms that SQL Express can be upgraded. This step is optional.

At this point you can close the box and hit OK until the setup is finished. After the setup is finished, you'll want to apply SP2 for SQL 2005 and any post-SP2 patches that are available. I don't know if it's required, but I always do an IISRESET after this to make sure that SharePoint reconnects properly to SQL.

That's all there is to it. Once you are using full SQL you have quite a few more options available to you like log shipping, database replication, SQL Profiler and more. If you have any questions about how to leverage those tools with SharePoint, leave me a comment and let me know.

tk

Tuesday, May 12, 2009

Try Your Reports Before Publishing Them

I've come across a site that lets you publish your reports online and see them working before adding them to your site. Try it; it's an amazing idea.

[screenshot: ReportSurfer]

http://www.reportsurfer.com

Report Surfer is a community site for users to upload, share, and run sample reports built on Microsoft's SQL Server Reporting Services platform.

Integrating Commerce Server 2007 with MOSS 2007

So, if you have been waiting to use Office SharePoint Server 2007 in conjunction with Commerce Server 2007, this is the whitepaper for you.

http://www.microsoft.com/downloads/details.aspx?FamilyId=2AEB1A5E-43B8-483B-8CB2-86C0E82BF0AB&displaylang=en

This document covers the topics needed to take Commerce Server data and services and make them visible in portal sites based on Office SharePoint Server 2007. Samples are provided to show how to make custom Web parts that access the Commerce Server APIs.

Microsoft Office 2010

Microsoft has released some information about SharePoint 2010 and Office 2010; this is a consolidated list of the information released so far.

  • http://www.microsoft.com/visualstudio/en-us/products/2010/default.mspx
  • http://blogs.msdn.com/somasegar/archive/2009/02/19/sharepoint-tools-support-in-visual-studio.aspx
  • http://channel9.msdn.com/posts/VisualStudio/Sharepoint-Development-with-Visual-Studio-2010/
  • http://download.microsoft.com/download/C/0/9/C0965791-049B-4200-9008-F07A783026F6/VisualStudio2010_ProductOverview.pdf