
Collaborate 2011 - Day 2

The second day of Collaborate started with an opening general session by Dan Thurmon. Dan is a so-called "motivational speaker", and he sure knows his job. In a very American style he promoted his dogma "Off Balance, On Purpose", demonstrated with juggling (even juggling axes while riding a unicycle). A very funny start to the day!
The first "real" session I attended was SQL Techniques by Tom Kyte. It was all about Clustering, setting up Index Organized Tables (IOT's) and Partitioning. The goal of all these techniques is to reduce IO. A nice metaphore he used was: You can put your clothes in a closet by just dumping them on the first free spot you see. So inserts are fast, but then retrieval trakes a full scan of your closet. By clustering pants, sweaters and socks together, inserts may be slower, but retrieval is way faster! But not only picking the right storage approach is important, also the retrieval - like using bulk/array fetching - are both important to reduce resources and increase performance and thus scalability. Don't tune queries, tune your approach!
Next on the agenda was Analytic Functions - Revisited. I am already familiar with this technique, but there are still some hidden gems to explore, like the package dbms_frequent_itemset, which contains functions to find correlated data. I have to check that out sometime. Also a warning: if you use analytic functions in a view, that might prevent the use of indexes.
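For the record, a minimal example of the analytic function style the session covered (my own illustration, using the classic EMP columns):

-- rank salaries per department, without a self join or correlated subquery
SELECT deptno,
       ename,
       sal,
       RANK() OVER (PARTITION BY deptno ORDER BY sal DESC) AS sal_rank
  FROM emp;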
The last session before lunch was called Why BPEL when I can PL/SQL?. It was a 30-minute power session, clarifying when BPEL might be a better choice than PL/SQL. If you need advanced Fault Management, need to integrate multiple non-Oracle technologies (like SQL Server, DB2, etc.) or need Human Interaction, BPEL has advantages over PL/SQL.
The first presentation after lunch was Tips and Techniques Integrating Oracle XML DB. Due to some (or rather a lot of) technical difficulties the presenter couldn't do his demo in SQL Developer and had to stick to the slides - which contained lots of code samples. He couldn't get his point across that way.
Metadata Matters was originally a Tom Kyte presentation (I think I've seen it before), but this time it was done by one of his Oracle colleagues. Very well done, I must say. The point was: put as much (meta)information into the database as you can, and let the optimizer decide what to do with it. The more it knows, the better it can do its job! So put in all constraints - and even datatypes are constraints. And "size does matter": if a developer defines all strings as varchar2(4000) - just to be sure - the amount of memory needed for returning a result set is way bigger than when you specify the real length. Check constraints can be used by the optimizer to rewrite queries - and reduce IO and speed up performance. If you define columns as NOT NULL, you create more access paths for the optimizer, like using an index scan instead of a table scan. You can even "tweak" this for columns that can be NULL, by appending a fixed value to the column value when creating the index - just to make sure every row is represented in the index.
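That index trick for nullable columns looks roughly like this (my own sketch, table and column names made up; a regular single-column index simply skips rows where the column is entirely NULL):

-- appending a constant makes the index cover every row of the table
CREATE INDEX emp_end_date_ix ON employees (end_date, 0);

-- so this query can now use an index scan instead of a full table scan
SELECT *
  FROM employees
 WHERE end_date IS NULL;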
Referential constraints can be used to remove tables from the query plan. And constraints can also be used to apply a query rewrite (so a materialized view can be used for delivering the result set instead of scanning the original table(s)).
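A hedged sketch of that join elimination idea (table names made up for illustration; it assumes a validated foreign key from orders.customer_id to customers.id and a NOT NULL customer_id):

-- the FK guarantees every order has exactly one matching customer
SELECT o.order_id, o.order_date
  FROM orders o
  JOIN customers c ON c.id = o.customer_id;
-- since no customer columns are selected, the optimizer can drop
-- the customers table from the plan entirely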
Most important: Java applications shouldn't handle all integrity and business rules themselves, because they can't do that job as fast and as well as the Oracle database does. Don't try to reinvent that wheel; Oracle has over 25 years of experience in that field and it's all at your disposal. So don't think you can do better within the timeframe of your project!
The last presentation of today was Giving Winning Presentations by Rich Niemiec. Funny, entertaining and with lots of tips on how to make your point during a presentation. I don't like his slides though.... ;-)
Now off to the Welcome Reception!