My list of the top open challenges in IT...

1. Metadata

Correctly handling metadata at large scale and in a distributed fashion is the foundation of a semantic web. This would enable a much richer exchange network.

2. Awareness

Environment "awareness" means the relation of an object to its context. For instance:
- the security context under which a program runs
- the available network resources
- the degree of optimization available for a query
- the links between this entry, for instance, and you, reader

Having a consistent set of environmental services, and built-in "awareness" in common languages, is crucial. It basically means that you get separation of concerns, yet benefit from rich default functionality.

3. Data mapping

Being able to seamlessly create live links between data sources is crucial, and maintaining that (hopefully "aware") link is also a basic functionality with no obvious standard answer. It has a huge impact on basically every domain you can think of.

4. Distributed computing

There are tons of critical applications that require huge computing resources: protein folding, DNA statistics...
Good news: there is a lot of computing power around the world...
Bad news: it is not available!
So finding a consistent way to perform distributed computation is a tremendous challenge. Let's abstract away from the vision of many computers and consider that you have a thousand-core processor. Without a consistent way to perform distributed computations, your apps don't go any faster with it; all you can do is launch a thousand apps if you want. Now consider that the app you want to go fast could make it possible - or not - to save millions of lives every year... I guess you can now get an idea of the frustration the current situation generates.

5. Data and metadata merge

Being able to merge data and metadata would make it possible for everyone to enrich the typing system while consuming someone else's data.
This merge would basically enable distributed modeling, for all those numerous domains where having a thousand consultants for a thousand years would not be enough. If you consider the workflow of creating software, it is very unsatisfactory today. The delegation chain between the requester, the manager, and the programmer is very, very complex. This is a huge bottleneck for high-bandwidth software, where you don't want to wait for the next three-year release. Merging data and metadata enables knowledgeable end users to extend the built objects to their needs. Hopefully, with "awareness", some statistical computation would prevent the resulting mess and offer model consistency across users.

6. ObjectNet

Being able to natively exchange objects would definitely push the web to a higher level. The promised land of CORBA and the like might become a reality if a performant infrastructure makes it usable.
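To make the data/metadata merge idea above a bit more concrete, here is a minimal sketch of what a self-describing message could look like. Everything here is hypothetical illustration: the envelope format, and the `pack`/`unpack` helpers are invented names, standing in for whatever standard would eventually fill this role. The point is only that when an object travels together with the metadata that types it, a consumer can extend the model without going back to a central schema authority.

```python
import json

# Hypothetical sketch: bundle an object with its own describing metadata,
# so the receiver gets both the data and the type information in one message.
def pack(obj, schema):
    """Wrap data and its metadata into one self-describing JSON envelope."""
    return json.dumps({"schema": schema, "data": obj})

def unpack(message):
    """Recover both the object and the metadata that types it."""
    envelope = json.loads(message)
    return envelope["data"], envelope["schema"]

# A knowledgeable end user can enrich the model while consuming the data:
msg = pack({"name": "insulin", "residues": 51},
           {"name": "str", "residues": "int"})
data, schema = unpack(msg)
schema["folded"] = "bool"   # the end user extends the typing system...
data["folded"] = True       # ...and the data along with it
print(pack(data, schema))
```

With "awareness" added on top, one could imagine the environment comparing such schema extensions across users to keep the models statistically consistent rather than letting them drift apart.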