Hi Guys,
I want to know how to check if HANA Live is installed on my HANA box.
Can you please help?
Thanks,
Adlin
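A quick way to check from the SQL console is to look for HANA Live delivery units in the repository. The naming patterns below (delivery units containing 'HBA', packages under sap.hba) are assumptions based on how HANA Live content is usually shipped, so verify them against the HANA Live documentation for your release:

-- Sketch: list delivery units that look like HANA Live content (naming pattern is an assumption)
SELECT DELIVERY_UNIT, VENDOR, VERSION
  FROM "_SYS_REPO"."DELIVERY_UNITS"
 WHERE DELIVERY_UNIT LIKE '%HBA%';

-- Alternatively, check for activated HANA Live packages in the repository
SELECT DISTINCT PACKAGE_ID
  FROM "_SYS_REPO"."ACTIVE_OBJECT"
 WHERE PACKAGE_ID LIKE 'sap.hba%';

If either query returns rows, HANA Live content is deployed on the box.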
Hello Sai,
There was a similar question in the space SAP HANA and In-Memory Computing; the correct answer (Re: Cognos on SAP HANA?) pointed to SAP Note 1577128.
As I have no clue, I will move your question to that space.
Regards,
Thomas
Hi,
I am getting the below error while activating an Attribute view in the HANA modeler.
Repository: Encountered an error in repository runtime extension; Deploy Attribute view: no view attributes
Create View DDL statement: CREATE COLUMN VIEW "_SYS_BIC_"."SYS-LOCAL.PUBLIC.<SCHEMA>/AT_REGION" TYPE JOIN WITH PARAMETERS ( ) NO STRUCTURED PRIVILEGE CHECK
Hi Sakthikumar,
As far as I know, this error comes when no column of the attribute view has been added to the output.
It would be good if you could provide a few more details about your model.
Best Regards,
Shireesha
As an SAP employee, you might be better off asking for an explanation of undocumented parameters in an SAP-internal forum or directly from the development team.
One thing I can tell you right away is that guessing what a parameter does (even the "obvious" ones) usually doesn't get the correct answer.
enable_remote_cache -> creates a cache on the remote server? creates a cache that is rather remote? caches data for access by remote systems? ...
Thanks a lot.
I have a tenant which was migrated from a single-container HANA.
I'm just trying to create 2 new tenants and do some tests on those, and it works.
I must export and import my data from the old tenant to the other.
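For the export/import step, a minimal sketch from the SQL console could look like the following; the schema name and file path are placeholders, and the path must be reachable by the indexserver of each tenant:

-- On the source tenant (placeholder schema and path):
EXPORT "MY_SCHEMA"."*" AS BINARY INTO '/hana/shared/export/my_schema' WITH REPLACE THREADS 4;

-- On the target tenant, importing from the same location:
IMPORT "MY_SCHEMA"."*" FROM '/hana/shared/export/my_schema' WITH REPLACE THREADS 4;

The HANA Studio catalog export/import wizard offers the same functionality if you prefer a guided route.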
Hi Lars,
You mentioned "the compression optimization only is performed when a lot of data has changed - or when it manually asked for", do you know if HANA upgrade will trigger this process automatically?
We upgrade our HANA from SPS07 to SPS10, after upgrade complete, we found HANA performance is very slow due to optimize compression started against multiple big tables we had in HANA, this make HANA system un-usable for hours, do you if there has way to control this behavior?
Thanks,
Xiaogang
In the past, range partitioned tables were compressed too rarely, e.g. BW fact tables. This has changed with SPS 10. So if you mainly see range partitioned tables being compressed, this specific improvement in SPS 10 can be the reason. As a consequence you should see a significantly reduced memory footprint. Theoretically you can set the auto_decision_func of the compression optimization back to the previous default value (temporarily), but usually this shouldn't be required.
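If you do want to inspect or temporarily adjust that setting, a minimal sketch could look like the one below. It assumes the parameter lives in the optimize_compression section of indexserver.ini, which you should verify for your revision before changing anything:

-- Show the current compression optimization settings (section name is an assumption)
SELECT FILE_NAME, LAYER_NAME, SECTION, KEY, VALUE
  FROM "SYS"."M_INIFILE_CONTENTS"
 WHERE FILE_NAME = 'indexserver.ini'
   AND SECTION   = 'optimize_compression';

-- Temporarily set auto_decision_func back to a previous value (placeholder shown):
-- ALTER SYSTEM ALTER CONFIGURATION ('indexserver.ini', 'SYSTEM')
--   SET ('optimize_compression', 'auto_decision_func') = '<previous default>'
--   WITH RECONFIGURE;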
Hello, I need to rename my HANA system from hostname.old to hostname.new and I am using the hdblcm tool for that. However, I am not sure if I have to manually rename the host before running hdblcm? If yes, I assume the HANA DB will not start.
Can someone provide any insight from their past experience here?
Thanks
Hi Hitesh,
thanks for your reply. We have added all authorizations to our ECC data provisioning DB user; however, the issue remains. Our ECC uses an MS SQL database, and the only existing schema containing DD16S is the native SAP schema RAZ (which is the ECC SID). There is no SYSTEM schema.
It seems that HANA somehow gets the information (or concludes) that table DD16S resides in the SYSTEM schema.
Kind regards, Josko.
Hello, after successfully creating a virtual table T008T in HANA from ECC, a select from that table gives an error: Could not execute '..ECC_RAZ_T008T"' SAP DBTech JDBC: [403]: internal error: Remote execution error No metadata found for logical table T008T for query ... T008T is a pool table from pool ATAB. What can I do? Regards, Sven
Hello Amurya,
You can check the link below.
Considerations About Renaming HANA Node Hostname | SCN
Regards,
Yuksel AKCINAR
Hi Experts,
I am trying to debug a calc view. I found info on help.sap.com saying to choose debug from the performance icon dropdown, and that only users with the SELECT and CREATE ANY privileges on _SYS_BIC can debug calculation views.
http://help.sap.com/saphelp_hanaplatform/helpdata/en/d5/1dbabf5be34b408b60bc94ec65243e/content.htm
I don't see the debug option in the modeling editor. How can I find out whether my user has those privileges for debugging, and how do I debug a graphical calc view or analytic view?
Thanks,
Prasad
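One way to check the privilege part from the SQL console is a lookup in the EFFECTIVE_PRIVILEGES system view. The filter values below simply mirror the documentation quote above (SELECT and CREATE ANY on _SYS_BIC) and are a sketch, not necessarily the exact check the debugger performs:

-- Does the current user hold SELECT and CREATE ANY on the _SYS_BIC schema?
SELECT PRIVILEGE, OBJECT_TYPE, IS_VALID
  FROM "SYS"."EFFECTIVE_PRIVILEGES"
 WHERE USER_NAME   = CURRENT_USER
   AND OBJECT_NAME = '_SYS_BIC'
   AND PRIVILEGE  IN ('SELECT', 'CREATE ANY');

If both privileges come back with IS_VALID = 'TRUE', the missing debug option is probably not an authorization problem.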
Hi folks,
I have a question about the duration of the KNN Text Categorization (TM_CATEGORIZE_KNN function in Text Mining) in HANA.
I have a training (labelled) table which has 24,600 records. It has text, maincategory, and subcategory columns, where maincategory and subcategory are my labels. Additionally, I have another table with 8,095 records to be predicted for each label. When I start the process, it takes about 140 seconds to finish all predictions for the 8,095 records (for both labels) and insert the results into one final table. What will happen when I have 8 million records to be predicted (assuming the training table size stays the same - actually it may increase as well)? Will it take 140,000 seconds, which is about 38 hours? Is that normal, or is there a way to increase the speed of the process?
Note: I am using an AWS r3.2xlarge instance type, which has 8 cores, 61 GB memory, and 1x160 GB SSD. The version is 1.00.110.00.1447753075.
For this process, I created an outer procedure (KNN_CHURN_TEST_OUTER) which reads the unlabelled records from a table, and an inner procedure (KNN_CHURN_TEST_INNER) which makes the predictions for a record (I have two labels, so it makes two predictions for each record). For each record, I call the inner procedure from the outer procedure.
Thanks,
Inanc
Here are the outer and inner procedures.
CREATE PROCEDURE "SYSTEM"."KNN_CHURN_TEST_OUTER" () LANGUAGE SQLSCRIPT AS BEGIN /***************************** Write your procedure logic *****************************/ DECLARE new_text NCLOB; DECLARE id INT; DECLARE CURSOR c_products FOR SELECT "id","text_data" FROM "SYSTEM"."AVEA_CHURN_TABLE_TEST"; FOR cur_row as c_products DO new_text := cur_row."text_data"; id := cur_row."id"; call "SYSTEM"."KNN_CHURN_TEST_INNER" (id, new_text); END FOR; END;
CREATE PROCEDURE "SYSTEM"."KNN_CHURN_TEST_INNER" (IN id INT, IN new_text nclob) LANGUAGE SQLSCRIPT AS BEGIN /***************************** Write your procedure logic *****************************/ DECLARE sub_cat NVARCHAR(128); DECLARE main_cat NVARCHAR(128); DECLARE num INT := 0; DECLARE num2 INT := 0; DECLARE CURSOR c_products FOR SELECT T.CATEGORY_VALUE, T.NEIGHBOR_COUNT, T.SCORE FROM TM_CATEGORIZE_KNN( DOCUMENT :new_text MIME TYPE 'text/plain' SEARCH NEAREST NEIGHBORS 22 "text" FROM "SYSTEM"."aveaLabelledData" RETURN top 1 "main_category" from "SYSTEM"."aveaLabelledData" ) AS T; DECLARE CURSOR c_products2 FOR SELECT T.CATEGORY_VALUE, T.NEIGHBOR_COUNT, T.SCORE FROM TM_CATEGORIZE_KNN( DOCUMENT :new_text MIME TYPE 'text/plain' SEARCH NEAREST NEIGHBORS 22 "text" FROM "SYSTEM"."aveaLabelledData" RETURN top 1 "sub_category" from "SYSTEM"."aveaLabelledData" ) AS T; open c_products; begin FOR cur_row as c_products DO main_cat := cur_row."CATEGORY_VALUE"; num := num + 1; END FOR; IF :num = 0 THEN main_cat := 'unknown'; END IF; end; close c_products; open c_products2; begin FOR cur_row2 as c_products2 DO sub_cat := cur_row2."CATEGORY_VALUE"; num2 := num2 + 1; END FOR; IF :num2 = 0 THEN sub_cat := 'unknown'; END IF; end; close c_products2; insert into "SYSTEM"."KNN_RESULTS" values (:id, :new_text, main_cat, sub_cat); commit; END;
Dear Experts,
Recently we have started conversion activities in our SAP HANA systems.
Can any of you suggest how to capture the DB growth? We also want to capture the top 100 tables which grew, and if possible certain objects too.
Jun
Maybe this link will be helpful for capturing the growth of the HANA DB and its tables.
You could try the SQL scripts from SAP Note 1969700.
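As a starting point, a simple snapshot of the largest column-store tables can be taken from M_CS_TABLES; persisting the result regularly into a history table of your own lets you track growth over time. The column names below are the standard monitoring-view ones, but verify them against your revision:

-- Top 100 column-store tables by current in-memory size
SELECT TOP 100
       SCHEMA_NAME,
       TABLE_NAME,
       RECORD_COUNT,
       MEMORY_SIZE_IN_TOTAL
  FROM "SYS"."M_CS_TABLES"
 ORDER BY MEMORY_SIZE_IN_TOTAL DESC;

-- Disk footprint per table, if the persistence size is more relevant
SELECT TOP 100
       SCHEMA_NAME,
       TABLE_NAME,
       DISK_SIZE
  FROM "SYS"."M_TABLE_PERSISTENCE_STATISTICS"
 ORDER BY DISK_SIZE DESC;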
I found SAP HANA Academy - SDI: ABAP Adapter [SPS 11], which looks great and probably can read any ABAP table but it's batch, not real time.
Hi Experts,
I am trying to schedule a job, but I need to make sure that the job is scheduled only once.
In xscron I specified: xscron: "* * * * * * *"
What actually happens if we enter all the parameters of the xscron as *?
And what should be done in order to schedule the job only once?
Thanks,
Abhishek.
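For comparison, here is a sketch of an .xsjob definition. The action name and the date values are placeholders, and the field order (year month day weekday hour minute second) plus the one-shot behaviour are assumptions to verify against the XS job scheduling documentation: with all seven fields set to *, the schedule matches every second, while pinning every field to a concrete value makes it match only that single point in time.

{
    "description": "Run the job exactly once",
    "action": "my_package:myjob.xsjs::doIt",
    "schedules": [
        {
            "description": "single run at one fixed timestamp (placeholder values)",
            "xscron": "2016 5 20 * 10 30 0"
        }
    ]
}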