
Collaborate 2011 - Day 2

The second day of Collaborate started with an opening general session by Dan Thurmon. Dan is a so-called "motivational speaker", and he sure knows his job. In a very American style he promoted his dogma "Off Balance, On Purpose", demonstrated with juggling (even with axes on a unicycle). A very funny start to the day!
The first "real" session I attended was SQL Techniques by Tom Kyte. It was all about Clustering, setting up Index Organized Tables (IOT's) and Partitioning. The goal of all these techniques is to reduce IO. A nice metaphore he used was: You can put your clothes in a closet by just dumping them on the first free spot you see. So inserts are fast, but then retrieval trakes a full scan of your closet. By clustering pants, sweaters and socks together, inserts may be slower, but retrieval is way faster! But not only picking the right storage approach is important, also the retrieval - like using bulk/array fetching - are both important to reduce resources and increase performance and thus scalability. Don't tune queries, tune your approach!
Next on the agenda was Analytic Functions - Revisited. I am already familiar with this technique, but there are still some hidden gems to explore, like the package dbms_frequent_itemset, which contains functions to find correlated data. I have to check that out some time. Also a warning: if you use analytic functions in a view, that might prevent the use of indexes.
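As a quick refresher (my own example, using the classic EMP demo table), analytic functions let you combine detail rows and aggregates in a single pass, without a self join or GROUP BY:

  select ename
       , deptno
       , sal
       , avg(sal) over (partition by deptno)  as avg_dept_sal
       , rank()   over (partition by deptno
                        order by sal desc)    as sal_rank_in_dept
    from emp;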
The last session before lunch was called Why BPEL when I can PL/SQL?. It was a 30 minute power session, clarifying when BPEL might be a better choice than PL/SQL. If you need advanced Fault Management, use multiple non-Oracle technologies (like SQL Server, DB2, etc.) or need Human Interaction, BPEL has advantages over PL/SQL.
The first presentation after lunch was Tips and Techniques Integrating Oracle XML DB. Due to some (or rather a lot of) technical difficulties the presenter couldn't do his demo in SQL Developer and had to stick to the slides - which contained lots of code samples. He couldn't really get his point across that way.
Metadata Matters was originally a Tom Kyte presentation (I think I've seen it before), but was now done by one of his Oracle colleagues. Very well done, I must say. The point was: put as much (meta)information into the database as you can, and let the optimizer decide what to do with it. The more it knows, the better it can do its job! So put in all constraints - and even datatypes are constraints. And also "size does matter": if a developer defines all strings as varchar2(4000) - just to be sure - then the amount of memory needed for returning a result set is way bigger than when you specify the real length. Check constraints can be used by the optimizer to rewrite queries - and reduce IO and speed up performance. If you define columns as NOT NULL you create more access paths for the optimizer, like using an index scan instead of a table scan. You can even "tweak" this for columns that can be NULL, by appending a fixed value to the column value when creating the index - just to be sure every row is represented in the index.
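A small sketch of that last trick (my own example, not taken from the slides): a normal B*Tree index does not contain rows where all indexed columns are NULL, so adding a constant as a second index column guarantees every row is in the index and makes it usable for an IS NULL predicate:

  create table tasks
  ( id             number primary key
  , processed_date date            -- NULL means "not processed yet"
  );

  -- The constant 0 makes sure rows with a NULL processed_date
  -- are represented in the index as well.
  create index tasks_processed_idx on tasks ( processed_date, 0 );

  -- This query can now use an index scan instead of a full table scan:
  select id from tasks where processed_date is null;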
Referential constraints can be used to remove tables from the query plan. And constraints can also be used to enable query rewrite (so a materialized view can be used for delivering the result set instead of scanning the original table(s)).
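For example (my own sketch, assuming a hypothetical SALES table and the required query rewrite privileges), a materialized view created with ENABLE QUERY REWRITE lets the optimizer answer the original query from the summary:

  create materialized view sales_per_product
    enable query rewrite
  as
    select product_id
         , sum(amount) as total_amount
      from sales
     group by product_id;

  -- The optimizer may transparently rewrite this query to read
  -- sales_per_product instead of scanning the sales table:
  select product_id, sum(amount)
    from sales
   group by product_id;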
Most important: Java applications shouldn't handle all integrity and business rules themselves, because they can't do that job as fast and as well as the Oracle database does. Don't try to reinvent that wheel; Oracle has over 25 years of experience in that field and it's all at your disposal. So don't think you can do better within the timeframe of your project!
The last presentation of today was Giving Winning Presentations by Rich Niemiec. Funny, entertaining and with lots of tips on how to make your point during a presentation. I don't like his slides though.... ;-)
Now off to the Welcome Reception!
