Monday, October 31, 2011

In two weeks: My first DOAG Conference

From November 15 to 17, the annual DOAG (German Oracle User Group) Conference will be held in Nuremberg. I've never attended this conference before, so I am really curious what my experience will be like - especially compared to the other big European Oracle event, the UKOUG conference three weeks later. The agenda looks very promising: a neat and tight three full days, with a new 45-minute session starting on the hour. Just like you'd expect from Germans ;-). And with 20 parallel sessions it seems about the same size as the UKOUG. The nice thing is, there are a lot of (German) speakers on the list I've never seen before. So that could be interesting. And there are a lot of APEX sessions as well - see my schedule below, where I colored all the APEX sessions blue! Some sessions will be translated from German to English, some will be in English, but the majority will be in German. So it's a good opportunity to dust off my German... In the flyer, sessions are marked when they're in English. Mine isn't, but I wasn't planning on doing it in German... But we'll see, maybe I'll end up in "Genglish" or "Engman"...
You can take a look at the whole program here.
Oh yeah, one shameless plug: my session on the XFILES (about the power of XML DB used for creating an APEX application - with version control for APEX itself as an example) is Tuesday at 4 PM...

Tuesday, October 25, 2011

Analytic function to the rescue! Again.

My current APEX project has a requirement to show a chart on one of the pages. No big deal. Usually. But because it should represent some value over time, and that value could be stored in the database as often as every second, this chart could contain tens, if not hundreds, of thousands of points.
So generating the XML, transferring the XML to the browser and interpreting the XML by the chart engine....was slow...
So I had to come up with a solution to reduce the number of points, without destroying the goal of the chart. Oh, and did I mention that the value could be stored every second, but could also be every minute, hour, whatever?
The first thing I came up with was the SAMPLE clause. I'd never heard of it and never used it before. You can just do a SELECT * FROM EMP SAMPLE(10) and, as a result, you'll get 10% of the rows of the EMP table. The drawback was that the result could be different every time, so when refreshing a chart, the chart could look really different. Another, more minor, hiccup was that the sample size has to be hard-coded and can't be parameterised (another "smaller" subrequirement).
So after some research I stumbled upon an analytic function that might do the trick: NTILE( number ). This function "divides an ordered data set into a number of buckets indicated by the number parameter and assigns the appropriate bucket number to each row" (quote from the documentation - couldn't say it better). Using this function, you can equally divide 100,000 records into 25 buckets, ordered by timestamp. And once you've done that, you can easily calculate the average value per bucket - and the average timestamp as well - and use just those values to generate a more minimalistic XML document... Oh, and another fine thing: you can pass any number parameter to NTILE, so the same query can generate either 10 or 10,000 points...

As a - functionally useless - example, a query that generates 15 rows with the average object_id and the average creation date of all_objects:
select  to_char( startdate + ( enddate - startdate )/2
               , 'YYMMDD HH24:MI:SS' ) label
,       average
from  ( select min(created)  startdate
        ,      max(created)  enddate
        ,      avg(value)    average
        from  ( select object_id value
                ,      created
                ,      ntile( 15 ) over ( order by created ) as bucket
                from   all_objects
              )
        group by bucket
      )
order by 1
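
Because NTILE accepts any number expression, the bucket count doesn't have to be hard-coded. A sketch of the same query with a bind variable - the :bucket_count bind name is my own invention for illustration:

```sql
-- Same downsampling query, but with the number of buckets as a bind
-- variable, so one query can serve charts of any resolution.
select  to_char( startdate + ( enddate - startdate )/2
               , 'YYMMDD HH24:MI:SS' ) label
,       average
from  ( select min(created)  startdate
        ,      max(created)  enddate
        ,      avg(value)    average
        from  ( select object_id value
                ,      created
                ,      ntile( :bucket_count ) over ( order by created ) as bucket
                from   all_objects
              )
        group by bucket
      )
order by 1
```

This is exactly what the SAMPLE clause couldn't do: the chart region can now decide its own resolution at runtime.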

Thursday, October 20, 2011

"I am running for the ODTUG Board of Directors"

No, it's not about me, but that's what my dear friend Martin Giffy D'Souza told me during last Oracle Open World. And I support his nomination wholeheartedly!
I've known Martin for about four years, since we met at one of the Kscope conferences. Since then we meet each other twice a year, but are in touch more frequently by email, Twitter and other social networks. Martin is passionate about his job, knows really a lot about APEX and is always willing to share his thoughts, ideas and vision with others. And I think nobody is better suited for the ODTUG Board than Martin!
If you wonder what Martin looks like...see the picture below, taken during the OOW Appreciation Event. He is the guy with the blue circle around his head ;-)

If you are an ODTUG member, you can vote here

You can read Martin's official campaign and bio below...

Campaign Statement
I have attended ODTUG Kaleidoscope for several consecutive years and have been a presenter for the last three. The conference has allowed me to develop strong relationships with many others in the community, and the importance of these relationships has proven invaluable. I continually strive to give back to the community, using my personal time to answer questions through email, blogs, the Oracle forums, and by writing technical books. I would like to continue this spirit of giving back by joining the ODTUG Board of Directors.
As a new board member I will bring a fresh perspective and out-of-the-box ideas to help promote ODTUG and deliver our message to the world. I am fortunate enough to have a successful blog with several thousand unique monthly visitors. It is through this platform, along with other opportunities such as my consulting firm blog, social networking, and the multiple annual conferences that I attend from which I intend to help share the ODTUG mission and values.
The Board plays a pivotal leadership role as both a driving force and a face of the ODTUG community. I feel that my professional experience as a leader and mentor will help the Board guide and develop ODTUG for the future. The Board has responsibility to its most important group - the members. I will help ensure that the Board serves as both a voice and an ear for the entire ODTUG community; developers, DBAs, and technical experts of all things Oracle.
Many thanks for your consideration.

Biographical Statement
Martin Giffy D’Souza is an Oracle ACE and award winning presenter and speaker. Most recently Martin was honored with the ODTUG Kaleidoscope 2011 Presenter of the Year award. Martin also serves as a Co-founder & CTO at ClariFit Inc., a consulting firm specializing in Oracle solutions. Martin’s career has seen him hold a range of positions within award winning companies and his experience in the technology industry has been focused on developing database-centric web applications using the Oracle technology stack. Martin is the author of the highly recognized blog, and he has co-authored several APEX books including Expert Oracle Application Express, a collaboration of some of the most renowned APEX developers in the industry. He has presented at numerous international conferences such as ODTUG, APEXposed, and COUG. Martin holds a Computer Engineering degree from Queen’s University in Ontario, Canada.

Thursday, October 06, 2011

OOW2011 - Announcing the APEX Marketplace

With the Oracle Database Cloud Service, Oracle also announced the APEX Marketplace - a sort of App Store for APEX, or APpEX Store if you like. When the Cloud Service is released, there will be a number of free APEX applications available for install. These are all created by Oracle itself and are very similar to the "Packaged Applications" that were available on OTN earlier. So if you're wondering why they're not available up there anymore and where they have gone: you've got your answer now!
Just like any other APEX application, you can (probably) still export such an application and install it in your own environment. But they will be "locked down", although it's not quite clear what that actually means. Probably, to prevent support issues due to your own changes, you aren't allowed to make changes. BTW, not all Marketplace applications will be available on OTN - some will, others won't...
Another, not yet implemented, idea is that you can also upload your own application into the Marketplace. The application will be reviewed by Oracle before it's made available. Your application won't actually be hosted up there; an interested user will be redirected to your own site and be able to download it from there. And the idea is that you can either provide these applications for free or charge for them. Don't expect it to make you a millionaire, but, hey, it all goes dollar by dollar anyway...
As a teaser, a screenshot of the applications that will be available from the start:

Five things you (probably) don't know about PL/SQL

This post is a (live) report from Tom Kyte's session with the title above, which he did in a packed room on Thursday morning at OOW2011.

1. Trigger trickery
A before-row trigger uses consistent read, so it sees the situation as it was when the statement started. During long-running updates the actual situation might differ from the 'consistent read' situation. That might lead to a rollback and re-fire of the statement, and thus of the trigger as well. So every before-statement and before-row trigger (apart from the one for the last row) might fire twice!
So don't do anything you can't roll back in a trigger. If you call some autonomous auditing function in a trigger, you might encounter rows in your auditing table for changes that didn't actually happen...
And direct path loads bypass triggers - so triggers don't always fire!
So, if you can avoid triggers...please avoid triggers.
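
As an illustration of the audit pitfall above, here is a minimal sketch of a before-row trigger that audits via an autonomous transaction; the EMP and SAL_AUDIT tables are assumptions for the example, not from the session:

```sql
-- A before-row trigger that writes an audit row in an autonomous
-- transaction. If the triggering UPDATE restarts (consistent-read
-- conflict) or rolls back, the audit rows stay behind anyway -
-- exactly the "rows that didn't actually happen" problem.
create or replace trigger emp_sal_audit
before update of sal on emp
for each row
declare
  pragma autonomous_transaction;
begin
  insert into sal_audit ( empno, old_sal, new_sal, changed_at )
  values ( :old.empno, :old.sal, :new.sal, systimestamp );
  commit;  -- committed even if the UPDATE itself never commits
end;
```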

2. Ignore errors
Error handling is done wrong more often than it's done right. Only catch exceptions that you are expecting - which means they aren't real exceptions anymore (like a NO_DATA_FOUND, but then in the same block as the SQL statement itself, not in a general handler at the end of your code). All other exceptions should be raised, either immediately or at a later moment in the transaction.
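
A minimal sketch of that pattern, using the classic EMP table as an assumed example: the expected exception is handled in a tight block around the one statement that can raise it, and everything else propagates untouched.

```sql
declare
  l_name emp.ename%type;
begin
  -- Tight block around the single statement that can raise the
  -- *expected* exception.
  begin
    select ename
    into   l_name
    from   emp
    where  empno = 1234;
  exception
    when no_data_found then
      l_name := null;  -- expected: this employee may simply not exist
  end;

  -- ... rest of the transaction. Any other error propagates to the
  -- caller instead of being swallowed by a WHEN OTHERS at the end.
end;
```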

3. Elaboration code
You can use instantiation code in a package body by defining an anonymous block at the end of the body. It runs only once per session (and again after every reinstantiation).
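
A small sketch of such an initialization block (the package name and its contents are made up for the example):

```sql
create or replace package demo_pkg
as
  function started return timestamp;
end demo_pkg;
/
create or replace package body demo_pkg
as
  g_started timestamp;

  function started return timestamp
  is
  begin
    return g_started;
  end started;

-- The anonymous block below is the "elaboration" code: it runs once,
-- the first time the package is referenced in a session (and again
-- after the package state has been invalidated and reinstantiated).
begin
  g_started := systimestamp;
end demo_pkg;
/
```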

4. Implicit conversions
Always use explicit conversions, especially when you're relying on specific NLS settings, for example when converting dates! Implicit conversions might even lead to SQL injection by tweaking the NLS_DATE_FORMAT setting!!!
Relying on implicit conversion can also carry a performance penalty, because the conversion takes CPU time and affects the access path the optimizer comes up with.
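
To make the difference concrete - again using the assumed EMP table:

```sql
-- Implicit: the string is converted using the session's
-- NLS_DATE_FORMAT, so the result (or an error) depends on a setting
-- outside the statement.
select * from emp where hiredate > '01-JAN-2011';

-- Explicit: the format mask is pinned inside the statement, so it is
-- NLS-independent and the optimizer sees a clean DATE comparison.
select * from emp where hiredate > to_date( '2011-01-01', 'YYYY-MM-DD' );
```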

5. Roles
You need direct grants on an object in order to use that object in a PL/SQL unit; roles don't work - on purpose. If you use "invoker's rights", your code runs with the roles and grants of the user who runs the code. By default, code is created with "definer's rights", and then it runs with the grants of the definer. Especially when you use invoker's rights, you could encounter unexpected results, because table T of the definer might not be the same table T of the invoker...
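
The difference is a single AUTHID clause; a sketch (procedure and table names invented for the example):

```sql
-- Definer's rights (the default): runs with the owner's direct grants,
-- and T always resolves in the owner's schema.
create or replace procedure count_t_definer
as
  l_cnt pls_integer;
begin
  select count(*) into l_cnt from t;
end;
/
-- Invoker's rights: runs with the caller's grants *and roles*, and T
-- resolves in the caller's schema - possibly a different table T.
create or replace procedure count_t_invoker
authid current_user
as
  l_cnt pls_integer;
begin
  select count(*) into l_cnt from t;
end;
/
```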

Location:Cyril Magnin St,San Francisco,United States

OOW2011 - Announcing SQL Developer 3.1 New Features

The new SQL Developer version 3.1 contains a lot of new functionality. For instance, a lot of DBA functionality that was already available within Enterprise Manager is now exposed in SQL Developer. The developers are using the exact same code, but with a SQL Developer skin on top of it.
Another very neat feature is the SQL Developer Cart. You can drag and drop any object into the cart, and it will automagically create a SQL script file and zip it. I certainly see a use for that, for instance to deploy a new version of an application: all files neatly packed together...
And of course, there are the Cloud Services. You can connect to your Oracle Database Cloud Service from within SQL Developer. Under the covers, a RESTful web service is called and the results are processed. There is also functionality to transfer data to and from the cloud using Data Pump. If you upload data into the cloud, the Data Pump file is automatically created, uploaded and processed. Very neat!

Location:Ellis St,San Francisco,United States

OOW2011 - Announcing the Oracle Public Cloud

Until this Wednesday, Oracle's statement was "use Oracle systems to build your own cloud", but this has changed drastically now! Starting November 1, Oracle does this for you. Just subscribe and within 20 minutes you are up and running in the brand new "Oracle Public Cloud".
So what do you get after these 20 minutes of waiting? It depends on what you ordered ;-). Within the Oracle Public Cloud (OPC), the following options are available:
- an Oracle Database Cloud Service
- a Java Cloud Service
- a Social Connect Cloud Service, based on WebCenter
- some Fusion Applications Cloud Services: CRM and HCM.

The Oracle Database Cloud Service
The Oracle Database runs on Exadata. Within a database you get your own schema(s) and tablespace(s), so while the database itself is shared with others, all data is partitioned and caged. To enhance security, Transparent Data Encryption will be switched on, and Database Vault might also be used (still under consideration). You can connect to the database over HTTP, REST and JDBC. You can order a small, medium or large service, ranging from 50 up to 250 GB of storage space. The data transfer is limited to six times the storage.
When using this service it might be important where your data is stored (for instance when you don't want to expose your data under the American Patriot Act). Therefore, next to the current data center in Austin (TX), a European center will be opened in Edinburgh, and more local centers are planned.
You can access the database via the APEX Listener, using RESTful web services. So any language that "speaks" REST, can use this service. Another option is to sign up for an APEX environment. Within a few minutes, you can start developing or deploying your APEX application in the cloud. Similar to, but in this environment you are allowed to run production applications!

The Oracle Java Cloud Service
With the Oracle Java Cloud Service (OJCS) you get your own - prebuilt - Oracle Virtual Machine on an ExaLogic server. This OVM contains one or more WebLogic 11g servers. Just like the database counterpart, this environment will be (almost) instantly available and easy to use and manage. When signing up, or later, you can associate your OJCS with an Oracle Database Cloud Service or a Fusion Application Cloud Service. There will be a pre-built integration with Fusion Apps.

When signing up for one or more of these services, you get a one-time free trial for 30 days. After that trial period (or immediately if you like), you will be charged per month for the services you're signed up for. Both the Java and the Database Cloud Service come in small, medium and large - and it is possible to upgrade or downgrade. According to Oracle the pricing will be "competitive" - whatever that may mean...
Before this all goes live, there will be an Early Access period.

Apart from the Cloud Services themselves, a lot of tools will be "cloud enabled" as well. There will be cloud add-ons for Eclipse and JDeveloper, Enterprise Manager will get a Cloud Control feature, and SQL Developer 3.1 will have cloud support for up- and downloading data (using REST web services or the new "Data Pump for the cloud").

So who's this all for? I think the small version of the services could be very interesting for setting up a development environment within minutes. No need to order hardware and software when starting a project, or to reserve your space in the "private cloud" of your own company.
The medium and large versions are targeted at test or production systems. But with the current size limit, only small and medium businesses - or isolated departmental applications - can use this. So it's not only a competitor for Google and Amazon, but also for the smaller hosting companies. But that will depend on the price...

More information and sign-up - when the time is there - on!

Location:Ellis St,San Francisco,United States

Tuesday, October 04, 2011

New APEX feature regarding RESTful web services

The upcoming Application Express version 4.2 - release date not announced yet - will (or might), amongst other features regarding mobile support, also contain functionality for managing the APEX Listener Resource Templates. From the SQL Workshop you can access the RESTful web services, and so create new or manage existing services of the APEX Listener - keeping everything within one IDE.
Very nice. It will enhance the use of the Resource Templates enormously!

Location:Howard St,San Francisco,United States

Know your code : Automate PL/SQL standard enforcement

This is a report of Lewis Cunningham's session at OOW11 on the subject mentioned above.
For code you not only need coding and naming standards, but performance and testing standards as well. And of course you need to check whether your code complies with your standards. For analysis you can do either static or dynamic analysis (and add instrumentation as well). For static analysis you can use the data dictionary, PL/Scope and the source code itself. For dynamic analysis you have to use profilers.
For the data dictionary, you can query the _SOURCE, _DEPENDENCIES, _PROCEDURES and related views. When using PL/Scope, you have to recompile the code with PLSCOPE_SETTINGS = 'IDENTIFIERS:ALL' switched on. Then the results are retrievable from the _IDENTIFIERS views.
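
A sketch of that workflow - the package name and the "l_"/"g_" naming convention are assumptions for the example:

```sql
-- Switch on PL/Scope collection and recompile the unit.
alter session set plscope_settings = 'IDENTIFIERS:ALL';

alter package demo_pkg compile;

-- Example check: flag variables that don't follow an assumed
-- "l_" (local) / "g_" (global) naming convention.
select name, type, usage, line
from   user_identifiers
where  object_name = 'DEMO_PKG'
and    type = 'VARIABLE'
and    name not like 'L\_%' escape '\'
and    name not like 'G\_%' escape '\';
```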
Analysis should be done at the earliest stage, that is during coding. PL/Scope can be very handy for validating naming conventions, impact analysis and identifying scoping issues.
For dynamic analysis, you have to set a baseline first: save the timings for that baseline and check them again after you've made changes. Code coverage is also useful, to detect pieces of dead code. You can get it by subtracting the dynamic (executed) code from the static (available) code. Alas, code coverage is not 100%, because PL/Scope doesn't detect SQL.
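
One way to do that subtraction is with DBMS_PROFILER data; a rough sketch (the unit name and the :run_id bind are assumptions, and this naive version also flags non-executable lines such as declarations):

```sql
-- "Dead code" candidates: source lines of a unit that the profiler
-- never saw executed during a profiled run (:run_id comes from
-- DBMS_PROFILER.START_PROFILER / STOP_PROFILER).
select s.line, s.text
from   user_source s
where  s.name = 'DEMO_PKG'
and    s.type = 'PACKAGE BODY'
and    s.line not in ( select d.line#
                       from   plsql_profiler_data  d
                       join   plsql_profiler_units u
                         on   u.runid       = d.runid
                        and   u.unit_number = d.unit_number
                       where  u.runid     = :run_id
                       and    u.unit_name = 'DEMO_PKG'
                       and    d.total_occur > 0 )
order by s.line;
```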
All this stuff should and can be automated....
All details are in the Expert PL/SQL book.

Location:Howard St,San Francisco,United States

Monday, October 03, 2011

OOW2011 - Announcing Oracle`s Big Data Appliance

During the Oracle ACE Director briefing, Mark Townsend, VP Database Product Management, did "State of the Union" on the Oracle database. This post is my report of his talk...
At this moment around 55% of the installed base of the Oracle database is on 11.2. Last year, Oracle made more money from selling more database licenses and more options on existing installations.
Mark mentioned that there are customers with thousands of databases (and even one with 80,000!) - all using different versions of the database, the operating system, storage, etc. This situation is very hard to maintain and to keep up and running. In Oracle's view, consolidation into a "private cloud" is the solution, and therefore Oracle offers Exadata. Fewer databases are easier to secure, easier to make highly available and easier to upgrade. And when you use Oracle software throughout your application stack, why not use Oracle hardware as well? So Oracle is striving towards a "red stack" (i.e. all Oracle).

The latest version of the Oracle database is planned for somewhere next year. After that, Oracle 12 will replace the "g" with a "c" - for "cloud" of course! - and should be available somewhere next year as well. Oracle 12c is not a subject at this OpenWorld. You can sign up for the beta test, which will start in November.
Last week, the Oracle Database Appliance (ODA) was announced. The ODA comes with 24 cores as standard, but you can license per core - completely different from the current licensing, where you have to pay for all cores that are available in your hardware. In Oracle's terminology, an "Appliance" is engineered for simplicity; anything called "Exa-whatever" is engineered for speed. Next to the ODA, Oracle announced this Monday the "Big Data Appliance", using a (new) Oracle NoSQL database (based on Berkeley DB). This appliance will do massively parallel batch processing with Hadoop. Therefore Oracle will distribute Hadoop (and support it as well). There will be an Oracle Data Integrator (ODI) to get the data from Hadoop into a relational Oracle database. Another new product in this appliance is "Oracle R". R is an open source replacement for SAS - a statistical tool for data analysts (like the software used by the female computer wizard in the TV series Criminal Minds). So the BDA consists of this whole stack (as I understood it). The BDA solution (or framework, or architecture) is aimed at processing huge bulks of non-SQL data (key-value pairs), like user clicks on websites, phone calls etc., but is good for old-fashioned ETL too!

Location:Ellis St,San Francisco,United States

OOW2011 - Announcing Oracle NoSQL

NoSQL databases have already been around for a long time. Even Oracle owns one: Berkeley DB. Other well known databases are Voldemort, MongoDB and Cassandra.
A NoSQL database contains only key-value pairs and targets only simple operations: storing and retrieving data. Any relationships and other rules should be enforced by the application itself. A NoSQL database has a small footprint, is embeddable, (very) fast, scalable and easy to use, and usually runs on a lot of operating systems.
Therefore the sweet spot of NoSQL databases is processing loads of simple and unstructured data, like messaging, queueing and user web clicks. Not surprising that the big social networks, like LinkedIn, Facebook, Google and Amazon, are heavy users of NoSQL databases. For some more advanced use, some NoSQL databases have options for concurrency, transactional processing and high availability. Of course you can store this kind of data in a relational database, like the "regular" Oracle database, as well, but that comes with a much higher price tag. An Oracle database can do so much more than just store data, but even if you don't need those features, you still have to pay for them...
This Monday Oracle announced their new Big Data Appliance, in order to acquire, organize and analyze large volumes of simple, unstructured data in an easy way. Part of this appliance is the new Oracle NoSQL database, which is - surprise, surprise - based on Berkeley DB. But, unlike most competitors, the Oracle NoSQL database has, next to a C++ and Java API, also a SQL API! So NoSQL doesn't mean no SQL at all, but Not Only SQL...
Oracle NoSQL will be available in two versions: a Community Edition which is free and open source and an Enterprise Edition. The functionality is the same, there is only a difference in the licensing... I am very curious how this will land in the, usually very independent and open source minded, NoSQL world!
More info on Oracle OTN

Location:Ellis St,San Francisco,United States

OOW2011 - Announcing the Exalytics machine

After a long rerun about Exadata, Exalogic and the SuperCluster during Sunday's keynote, Larry finally announced the new Exalytics machine. Extreme speed, due to 1 TB of DRAM (holding 5 to 19 TB of compressed data) and 40 cores of Xeon CPUs. It uses a new version of TimesTen - the in-memory database - and/or a new version of Essbase (for OLAP) and a new OBIEE. It not only handles relational and multidimensional data, but also "unstructured" data. You connect the Exalytics machine to your Exadata machine with InfiniBand, then load (all) data into the Exalytics machine and start analyzing and processing in memory. It uses a "Heuristic Adaptive In-Memory Cache", so data changes are detected and refreshed in the machine. Oracle claims it is around 20 times faster than their current configurations.
Price tag? Not mentioned...

Location:Howard St,San Francisco,United States

Connecting NoSQL and Oracle Databases

NoSQL databases are very good at storing and retrieving large amounts of data. Analysis, on the other hand, is hardly possible. And you probably need that to actually make money out of Facebook, LinkedIn, etc.
Therefore you need to transfer the data to a "regular" database, like Oracle.
There are two options to do this kind of stuff. The first one is a MySQL Data Hub that handles both the NoSQL and Oracle tables as external tables. Another option is to transfer the data to Oracle directly. Quest offers some (free) products in this area, like TOAD for Cloud Databases. Very interesting - I definitely need to try this out. Thinking about APEX talking to a NoSQL database now...

Location:Howard St,San Francisco,United States