My Experience: C_HANAIMP131 – SAP HANA 1.0 SP6 Associate certification

It is disappointing to realize that it has been two years since I last blogged on WordPress. As a matter of fact, the last time I did was when I cracked the BW 7.0 associate certification. Without further ado, I invite you to my personal point of view on the C_HANAIMP131 (SAP HANA 1.0 SP6) exam, which I managed to pass on April 29th, 2014.

 

Why get certified on SAP HANA?

Well, this must be a generic question on everyone’s mind, and maybe not specific to the SAP HANA world. The cost of the exam is on the higher side. As far as I know, in North America it costs about $500 through Pearson VUE, or at least that is what it used to cost when I worked there. Down here in the Indian sub-continent it costs about Rs. 39,000. So my point is: it is expensive, and whether or not to take it is a major decision. Jotted below are my reasons, in descending order of priority.

Knowledge: This is something that I realized as an outcome of my previous certification. I had appeared for the BW 7.0 exam (C_TBW4570) two years ago, right around the same time. I never realized its value while I prepared ardently for it and then managed to pass. The knowledge I gained preparing for that exam has given me an upper hand at work even to this day. I ever so often run into situations where I am thrown into a completely new scenario, and fortunately, with the knowledge acquired from the exam preparation, I usually manage to assess and handle the situation appropriately. SAP certifications are in a way the product manuals: the syllabus touches upon every concept and functionality of the product. If you seriously prepare towards clearing the exam, you really do not have room to ignore any topic. This ensures the examinee knows and understands every aspect of the product, which in turn vouches for the quality of the examinee.

Writing two exams two years apart, I feel SAP has improved the quality of the exam quite a bit. The HANA exam I wrote now had a mix of approximately 60% direct questions and 40% scenario-based, twisted questions. Unless you know the concept well and have worked on it, getting the twisted ones right is not easy. The passing score was 61%, but beware: as more consultants get certified and newer service packs are released, the cutoff goes up. Look it up on the SAP website for the latest figure (there is a slight chance even that may not be up to date).

Confidence: Passing the exam builds a certain level of confidence, which is vital in today’s ever-competitive work environment. It is not “the sole” requirement for succeeding at your workplace; that of course requires a lot of experience with each of the scenarios within the product. But it arms you with the confidence that you have fundamental knowledge of all aspects of the product.

Selling yourself: Of course, let’s all candidly agree that this is the primary reason most of us sit the exam. The chance to put it on your resume, attach the logo to your corporate email signature, talk about it during those corporate team meetings, or simply use it as a strong point during your appraisals. Well, I say it’s a valid and good reason. Flaunt your accomplishment with all glory! You worked for it; you deserve to boast about it. But I have to admit that not all interviewers go by the certificate. All I can say is: if a call has to be taken between a certified and a non-certified consultant who perform similarly in their interviews, the certificate gives that extra bit of assurance about the quality of the consultant.

 

How to prepare for a HANA certification?

People have different ways of preparing for an exam, and I have my own. But a few things are common to all, and I’d like to call them out. Preparing for an SAP certification exam is a serious decision to take: it is expensive, and it requires the examinee to be self-motivated and focused enough to go through mountains of material. An ideal preparation would be a minimum of 6 months of detailed hands-on work on the product, followed by a month of rigorous study for around 5-6 hours a day. This is just my personal take; it can vary depending on the individual. Also, SAP may impose specific experience requirements to ensure the quality of examinees.

Experience on the product HANA: As I mentioned above, this exam has several twisted questions. You really need to get very comfortable with the tool and work on some real-life scenarios. Just building an attribute or analytic view might not be adequate. Most of the questions come from advanced modeling, and there are quite a few questions on optimization and performance tuning of models. Unless you have modeled, failed and then rectified, these questions are hard to nail. Actual experience on HANA also helps you absorb all the theory and concepts required for answering the direct questions.

I had quite some experience building POC scenarios for different customers, having worked on and off on HANA for about a year. I started working on it out of personal motivation, signing up for an AWS m2.xlarge instance. It was expensive for me, but I looked at it this way: how much would you pay for first-hand experience on a product as aggressive as HANA, pretty much SAP’s game changer (revenue churner, or whatever you want to call it) of the last decade? With that thought, everything felt reasonable. Later on I was given access to my organization’s bare-metal demo system.

One good thing about SAP HANA is that SAP has evolved out of its black-box approach. The SAP environment was never really available for the masses to learn and explore, unlike the offerings of other software majors. With HANA, SAP opened the doors and made the product available through various cloud service providers. In fact, SAP does not charge for the HANA instance; you only pay the cloud service provider their respective service charges. For self-preparing people out there like me, my advice would be to take the 30-day trial subscription on CloudShare (entirely free) and then maybe switch to Amazon, as it works out a bit cheaper than its peers. I don’t vouch for any of the service providers; it is your call to compare the prices, hardware configurations, accessibility, schemes etc. and decide on the right HANA cloud service provider, as each compares differently on the various parameters. Anyway, rather than digress from the topic: it is worth getting a connection and playing with every scenario in the exam syllabus, as you would have almost all the privileges and access you could possibly need.

There are topics that may not be available to the masses for learning purposes. Topics such as backup/recovery, the different provisioning methods, BW interactions with HANA, BOBJ interactions with HANA etc. cannot be simulated on the developer cloud systems. Fortunately, I was able to get basic display-level access in my organization for most of these scenarios. If you do not have that privilege, I suggest you search the internet, watch videos and read documents; it will help you mentally picture the activities in each of these scenarios.

Training: I have never had the opportunity to sit in SAP’s official trainings, so I am not in a position to judge them. But I believe the quality would be good, as it lets you appear for the exam without real project experience. If you are thinking of third-party training, I have a negative view. There are loads of materials, how-tos, guides and YouTube videos on SAP websites; going through them gives you a good feel for the product. As SAP has taken an open approach with HANA, you can use Google to get to almost any topic of interest. I am not going to list the links here; there are several blogs on SDN which already have this organized.

Study Material: I prepared for the exam from the SAP training materials HA100, HA300 and HA350. Apart from this, I went through a lot of videos, SDN blogs, TechEd (now called SAP d-code) content from 2012 and 2013, ASUG documents etc. While preparing, I would go through each topic, try to simulate it on AWS, and search for videos and blogs on it to get a better understanding. Fortunately, I was given official access to the material through a partnership program my organization had with SAP. If you do not have access to these materials, you can still prepare for the exam by scouring the internet and reading up on the topics from various blogs and other media; you would just have to painstakingly search for all the content.

 

General guidelines for preparing for the exam

Before the exam: If you have access to HA100, HA300 and HA350, you really need to go through them patiently. I went through them three times and prepared a 60-page summary note, which I additionally went through a couple of times. My dedicated preparation for the exam spanned a month and a half. Questions can come from almost anywhere, maybe not even from the text of the book but from one of the images you might not have paid attention to. It is imperative to go through the books patiently, paying attention to every line, footnote, tooltip, image, exercise etc.

It took me nearly a month to go through all the books the first time, as I patiently took summary notes and simulated all the scenarios. The second revision was done in about a week’s time, and the remaining revisions I managed to cram into another week just before the exam. Even after all of this, I felt stumped by some questions. There are two types of questions: single right answer and multiple right answers. In the case of multiple right answers, there is a note mentioning the number of right answers (I know, phewww!! 🙂).

Even Good Guy Greg agrees…

[Good Guy Greg meme image]

Either way, even the wrong options are tantalizing, so if you have only a basic understanding of the topic you will fall right into the trap.

At the exam: You will have 80 questions to answer in 180 minutes, which I think is very reasonable. Especially if you are from an engineering background and have gone through those Math and Physics exams with heavy calculus problems, this exam should not be much of a hurdle; the time is more than adequate. Never jump the gun: read each question patiently and read the answer choices patiently. I ran into several questions where all the answers looked right. The approach I took was to go through each answer choice and try to prove it incorrect against the question; the ones that survived stood the best chance of being right. Basically, take educated guesses. You have a lot of time, so use it! There is an option to flag a question and come back to it later, so maybe flag all the tough questions and go through them at the end. This gives you that extra bit of confidence that you have already answered the majority of the questions.

After the exam: Talk about it, boast about it, flaunt your certificate. Try to educate and motivate others towards the exam!

I happened to notice, just before the exam, that the next version, C_HANAIMP141 based on SP7, had already been released. It was too late for me to change the exam code, so I had to stick with C_HANAIMP131.

Comments? Questions? Feedback?

If there is something I can help with (within agreeable limits), I’d be glad to assist! Hey, the real value of knowledge is when it is shared.

But,

If you are planning on asking for the material, ask Grumpy Cat… He’s the boss!


I despise the use of dumps or soliciting questions from those who have already appeared for the exam. You would never walk out of the exam hall satisfied, and you would demean the efforts of those who got there the hard way. So, if you’re planning on asking me for something of that sort, speak to King Boromir.

[Boromir meme image]

Cheers!!! 🙂

My Experience: C_TBW45_70 – SAP Certified Application Associate – Business Intelligence with SAP NetWeaver 7.0

Finally nailed it! I am an SAP BI certified associate now. It may not be a big deal to many, but to me it is a great feat.

I have a total IT experience of 5 years 6 months (as of today), during which I was exposed to a slew of technologies that deliver business intelligence at various layers (integration, warehousing, reporting, analytics). My experience has been predominantly application support (I know, not as dandy as implementation projects). Over the last two and a half years I had the opportunity to be part of a team where a lot of SAP was being used. I was not assigned to the SAP components (SAP BI primarily), but that didn’t stop me from dirtying my hands in SAP BI. I subscribed to Michaelmanagement’s quarterly subscription for SAP system access (quite expensive, but I needed it badly) and played around. I found SAP BI very interesting and eventually decided to shift my career to SAP. I figured a certification would be a good start. That was how it all began…

I dedicated 3.5 months to preparing for the exam, spending on average 10 hours a day during the last month. Making up for the lack of implementation experience or formal training warranted the effort. I believe motivation is very important to keep you going, especially when you need to stay continuously plugged into your books. (Un)fortunately, I was going through a tough turn in my career, and I used that to motivate me towards achieving the certification. Something life taught me…

Preparation Specifics

All you need are the books mentioned on SAP’s official web page (apart from the motivation, of course): TBW10, TBW20, TBW41, TBW42 P1, TBW42 P2 and TBW45. You would need to read them thoroughly at least thrice (I did). Questions can turn up from unexpected nooks and corners. The questions get twisted too, which means just reading won’t do; you need to make sense of them as well. It is worth creating a summary booklet of bullet points. It helped me quite a bit to go through the summary booklet during the last couple of days rather than browse through the endless ocean of pages.

I appeared for the exam on 03/16/2012 and came out with 80%. It feels really good when you get something you crazily longed for, especially if you slogged for over three months 🙂. Some emotions in life are simply priceless!

InfoPackage/DTP Package Size

I learnt something new today about Package size while extracting from ECC into BI 7.0.

I happened to notice that one of the datasources, which runs in delta mode from ECC to BI, was transferring a very high number of packages. What was strange was that although the number of packages was high, the number of records each package contained was significantly low. This aroused my curiosity!

I knew that BW internally allocates the package size, but I did not know the criteria by which this was done.

After reading a few posts on SDN, I figured out the logic behind this golden number called package size.

[Credits to original posters at SDN]

Package size = MAXSIZE * 1000 / size of the transfer structure, but not more than MAXLINES.
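To make the formula concrete, here is a worked example with purely illustrative numbers (MAXSIZE and MAXLINES come from the source-system transfer settings, e.g. table ROIDOCPRMS, with MAXSIZE in KB). Assume MAXSIZE = 10,000 and a transfer structure of 500 bytes per record:

Package size = 10,000 * 1000 / 500 = 20,000 records per package

If the transfer structure were 5,000 bytes wide instead, the same formula would give only 2,000 records per package, while a very narrow structure would simply be capped at MAXLINES. This also explains the behavior I observed: a wide transfer structure drives the computed package size down, so the same data volume arrives as many small packages.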

I tried simulating and it worked out like a gem!


SAP BI Cube Zero Record Timing problem

This post is about another problem I faced in our production system; it took significant analysis and simulation to understand.

When you have a configuration as below:

  1. A DataStage job posts into BW (3.x datasource)
  2. The InfoPackage triggered is of type “Update PSA and data target in parallel”, the target typically being a write-optimized DSO
  3. An external scheduling tool triggers the DataStage job
  4. Another BW chain, also triggered by the external scheduler, extracts data from the DSO and posts it into a cube

There is a possibility that your DTP load from the DSO to the cube could post empty data.

The reason this happens is that BW takes a while to actually update the data in the write-optimized DSO. BW responds to DataStage as soon as it receives all the data; it does not wait for the data to be saved in the DSO.

If your BI system is slow for any reason, the external scheduler will trigger your DTP for the cube well before the data is actually saved in the DSO. This eventually results in zero records being transferred to the cube.

A solution to this problem is to have DataStage post only to the PSA, and to use a single process chain to push the data from the PSA to the DSO and from the DSO to the cube.

 

SAP DataStage – Unable to delete variant problem

It has been a while since I created a post. I have slowly migrated to SAP’s BI product and have been learning it. Recently we hit a major issue with DataStage working with SAP. I had to perform extensive simulation and analysis to understand the problem and arrive at a solution. I would like to share the analysis with everyone (it could save some precious time)…

Explanation

Multiple calls are made by DataStage to the function module Z_RFC_DS_SERVICE with different parameters; accordingly, different activities such as create variant, delete variant, load program etc. are performed.

Z_RFC_DS_SERVICE is the function module invoked remotely by DataStage through RFC.

Z_RFC_DS_SERVICE invokes the function module RS_VARIANT_DELETE.

This function module in turn performs the form check_v_used_in_job:

perform check_v_used_in_job using rsvar-report rsvar-variant.

This form invokes the function module BP_VARIANT_USED_IN_JOB.

Depending on the response from this function module, the “unable to rename/delete variant” error is thrown: if the number of job steps in TBTCP that reference the variant is not equal to the number of corresponding entries in TBTCO with finished or aborted status, the error is raised.
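To visualize the check, here is a rough SQL sketch of the comparison (not SAP’s actual code; it assumes the standard TBTCO status codes ‘F’ for finished and ‘A’ for aborted):

-- job steps that reference the report/variant
SELECT COUNT(*) FROM TBTCP
WHERE PROGNAME = :report AND VARIANT = :variant;

-- of those jobs, the ones already finished or aborted
SELECT COUNT(*) FROM TBTCP p
JOIN TBTCO o ON o.JOBNAME = p.JOBNAME AND o.JOBCOUNT = p.JOBCOUNT
WHERE p.PROGNAME = :report AND p.VARIANT = :variant
AND o.STATUS IN ('F', 'A');

If the two counts differ, some scheduled or active job still references the variant, and the delete is refused.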


Data Mining | Scores and Models

What is scoring?

Scoring is a well-known concept in data mining. For the newbie: data mining is the use of data from a data warehouse for purposes such as marketing. It involves complicated logic applied to get the maximum out of the data available on sales, guests, inventory etc.

Coming back to our topic, scoring is one of the methodologies in data mining. Scoring can be applied to several dimensions such as guests, items etc.

Example:

Consider that you are planning to send out a discount offer on bakery items. Being a retail giant, you may have around 100 million customers, and sending offers to all of them would be an expensive investment. Hence you would rather send it to customers who have a history of purchasing bakery items. That is where scoring comes in. The probability that a customer will buy a bakery item again is assigned as a score to that guest: say a score of 0 is the lowest probability that they will purchase a bakery item, and 1 the highest. This depends directly on the number of times they have already purchased bakery items.

What is the Scoring Process?

[Image: overview of the scoring process]

The picture above summarizes the scoring process. I shall explain each of its sections in detail.

Segment

Segment is the category of data to which the scoring process is applied. An example will explain it well.

Example:

Say I decide to run the bakery discount offer for women in the age category 30-40. This becomes the segment that forms the candidate for the scoring process.

Model

Model is the logic or rule applied to the data to decide what score it gets. Models are usually written in an XML-based language called PMML (Predictive Model Markup Language). Designing a model is itself a critical and complex process, and usually a dedicated team is available for this.

Example:

The model decides how to interpret the data. An example of a model would be

(Total amount spent on bakery items by the guest) / (Total amount spent by the guest on purchases)

Basically, it is just the calculation that needs to be applied to the data to generate the score.
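As a minimal sketch, the example model above could be computed in SQL, assuming a hypothetical PURCHASES table with columns GUEST_ID, CATEGORY and AMOUNT:

SELECT GUEST_ID,
       SUM(CASE WHEN CATEGORY = 'BAKERY' THEN AMOUNT ELSE 0 END) / SUM(AMOUNT) AS BAKERY_SCORE
FROM PURCHASES
GROUP BY GUEST_ID;

Each guest gets a score between 0 and 1: the share of their total spend that went to bakery items.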

Scoring Engine

The scoring engine is responsible for applying the model to the input segment of data. It could be a stored procedure, a Java-based application etc.

Score

The output of the whole scoring process is the score. The score need not always be 0-1; it could be any desirable range of values.

Campaign

Just having the scores does not complete the work; now comes the campaigning. Once the scores have been assigned, a selection is made on the scored data to identify which guests should be chosen or eliminated.

Example:

Once scoring is done for all women in the 30-40 segment, we have a range of scores (0-1). The guests can be ordered by score and the top 30% chosen for sending out the offers.
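Continuing the hedged sketch, the top 30% could be picked with a window function over a hypothetical SCORES table (GUEST_ID, BAKERY_SCORE):

SELECT GUEST_ID
FROM (SELECT GUEST_ID,
             PERCENT_RANK() OVER (ORDER BY BAKERY_SCORE DESC) AS PCT
      FROM SCORES)
WHERE PCT <= 0.30;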

Summarizing everything

A segment of data is taken from a large data warehouse to perform the scoring; this selection is usually decided by the business teams of the retail firm. Usually a view or table is created for each segment, as there are several segments and several models. A model is then applied to these segments. The model is again usually built by a dedicated modeling team; it is the statistical analysis that is performed on the segment to assign the score. Once the score is generated by the scoring engine, the scores are saved in an ordered format. Again, as there are multiple models and segments, dedicated tables are usually created. The campaigning team then runs selection criteria on the data based on the scores and decides the audience for the discounted bakery offer.

Firing up DB2 9.5 on Ubuntu Hardy Heron

I work on IBM DB2 at the office, so I was curious to set it up at home on my new lappy. It was fun, and I did it successfully.

I found it quite difficult to set up though, but after a few hiccups I was able to make it work. Even though I ran the command-line setup, the engine never started when I issued the db2start command. After a lot of googling I found a solution, ironically on a Russian website. 🙂

De-package

sudo dpkg -i db2exc_9.5.0-1gutsy1_i386.deb
Set the annoying kernel parameter (a plain sudo echo with >> fails because the redirection runs as your user, hence tee)

echo "kernel.shmmax=1610612736" | sudo tee -a /etc/sysctl.conf

sudo sysctl -p

Run the Set-up (from the directory where the installer was extracted, here /home/Prabhu/exp)

. ./db2_install

Start the Engine

db2start

Create the SAMPLE database

db2sampl

Refresh

db2stop

db2start

Connect and Enjoy

db2 connect to sample

ftp timeout in AIX

Ever wondered how you could increase the timeout of an FTP connection? The default FTP timeout on AIX is 15 minutes. This can easily be increased.

  • Edit the /etc/inetd.conf file.
  • Go to the line where the ftp service is configured (run a search in vi).
  • Add “-t <time-out limit>” to the ftpd entry (an illustrative line is shown below).
  • For further details, see the man page of the FTP daemon: “man ftpd”.
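As a purely illustrative sketch (the exact field layout varies by AIX level), the ftpd line in /etc/inetd.conf might end up looking like this, with the timeout given in seconds:

ftp stream tcp6 nowait root /usr/sbin/ftpd ftpd -t 3600

After saving the file, refresh the inetd subsystem so the change takes effect:

refresh -s inetd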

Updating a large table with partitions

I came across a situation where I had to update a very large table: it had 373 million records. A simple update could time out, or you could end up with a “snapshot too old” error.

UPDATE EMP
SET EMP_LOC_C = '03'
WHERE EMP_LOC = 'NY'

 

Now, with this many records, let alone more than a billion, the query might simply not succeed.

The ideal solution is to run an update partition by partition. You can get the partitions for your table by querying dba_tab_partitions.

 

SELECT TABLE_NAME, PARTITION_NAME
FROM DBA_TAB_PARTITIONS
WHERE TABLE_NAME = 'EMP'

P_001
P_002
. . . . .
P_030

Now what you can do is run the update against each partition separately.

UPDATE EMP PARTITION (P_001)
SET EMP_LOC_C = '03'
WHERE EMP_LOC = 'NY'

 

 

Moreover, if there is an index on any of the columns, you can use it as well and split the query even further.

UPDATE EMP PARTITION (P_001)
SET EMP_LOC_C = '03'
WHERE EMP_LOC = 'NY'
AND EMP_I > 8612751936

 

So this way you can split and update the table partition by partition.
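To avoid typing one statement per partition, the per-partition updates can also be generated dynamically. A minimal PL/SQL sketch, assuming the EMP table and columns above, and committing after each partition to keep undo usage small:

BEGIN
  FOR p IN (SELECT PARTITION_NAME
            FROM DBA_TAB_PARTITIONS
            WHERE TABLE_NAME = 'EMP')
  LOOP
    -- run the update for one partition at a time
    EXECUTE IMMEDIATE
      'UPDATE EMP PARTITION (' || p.PARTITION_NAME || ')' ||
      ' SET EMP_LOC_C = ''03'' WHERE EMP_LOC = ''NY''';
    COMMIT; -- release undo after each partition
  END LOOP;
END;
/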