Database Technologies and Information Management Group

Welcome! The DTIM research group at Universitat Politècnica de Catalunya (UPC) conducts research in many fields of data and knowledge management, with particular emphasis on big data management, NoSQL, data warehousing, ETL, OLAP tools, multidimensional modeling, conceptual modeling, ontologies and services. DTIM is a subgroup of the Software and Service Engineering Group (GESSI) at UPC, and its members belong to the ESSI and EIO departments of the same university.

Nobody educates anybody, nobody educates himself; men educate each other under the mediation of the world.
Paulo Freire. Pedagogía del oprimido. Montevideo: Tierra Nueva, 1970.


Latest News (see all)

  • Jul 12
    Databases: Past, Present and Future
    Oscar Romero gives a talk at the Computer Science School of Barcelona about the evolution of the database field, in the framework of the 40th anniversary of the school.
  • Jun 29
    Collaboration agreement signed with Probitas
    Agreement signed with the Probitas Foundation to improve and extend their software for managing analysis results in the framework of the Global Laboratory Initiative.
  • Jun 28
    Besim Bilalli's PhD Thesis defense at UPC
    Title: Learning the impact of data pre-processing in data analysis Author: Besim Bilalli Programme: Erasmus Mundus Doctoral Degree in Information Technologies for Business Intelligence Department: Department (more)

Latest Blog Posts

Measuring MongoDB query performance

MongoDB has become one of the most widely used NoSQL data stores. Its document-based storage provides high flexibility as well as complex query capabilities. Unlike in Relational Database Management Systems (RDBMS), there is no formal mechanism for data design and query optimization in MongoDB, so we have to rely on the information provided by the data store and fine-tune performance ourselves. The query execution planner and the logger offer valuable information for achieving optimal query performance in MongoDB.
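As a minimal sketch of how the planner's output can be read: in the shell, `db.collection.find(query).explain("executionStats")` (or `Cursor.explain()` in PyMongo) returns a document whose `executionStats` section reports how much work the server did. The field names below follow MongoDB's standard explain format, but the sample result and the helper function are hypothetical, not part of the tooling described in the post.

```python
def summarize_execution_stats(explain_result):
    """Extract key efficiency indicators from an explain() result dict."""
    stats = explain_result["executionStats"]
    n_returned = stats["nReturned"]
    docs_examined = stats["totalDocsExamined"]
    # A docs-examined/docs-returned ratio far above 1 suggests a missing
    # index: the server scanned many documents to produce few results.
    ratio = docs_examined / n_returned if n_returned else float("inf")
    return {
        "nReturned": n_returned,
        "totalDocsExamined": docs_examined,
        "totalKeysExamined": stats["totalKeysExamined"],
        "docsPerReturned": ratio,
        "millis": stats["executionTimeMillis"],
    }

# Hypothetical explain output for an unindexed query (collection scan):
sample = {
    "executionStats": {
        "nReturned": 10,
        "totalDocsExamined": 100000,
        "totalKeysExamined": 0,
        "executionTimeMillis": 120,
    }
}

summary = summarize_execution_stats(sample)
print(summary["docsPerReturned"])  # 10000.0 documents examined per document returned
```

A ratio like this, together with `totalKeysExamined` being 0, is the typical signature of a full collection scan that an index on the queried field would avoid.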

In the world of research, also known as the world of "publish or perish", publishing a paper is of "life or death" importance. One of the most relevant measures of success in this world is the number of publications one has, and therefore the goal of every PhD student is to maximize it. Yet, given the time constraints of a PhD, one has to carefully plan the venues where one is going to publish. This is not a trivial problem, and it is aggravated when one has to publish a journal article. We therefore developed a tool that helps students find the right journal given their time constraints.
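The core of the selection problem can be sketched as filtering venues by expected turnaround against the time a student has left. The journal names and turnaround figures below are hypothetical, and the real tool's criteria are certainly richer than this:

```python
# Hypothetical data: (journal name, expected months from submission to publication).
journals = [
    ("Journal A", 18),
    ("Journal B", 9),
    ("Journal C", 24),
    ("Journal D", 12),
]

def feasible_venues(journals, months_left):
    """Return the journals whose expected turnaround fits the remaining
    time, fastest first."""
    fits = [(name, m) for name, m in journals if m <= months_left]
    return sorted(fits, key=lambda jm: jm[1])

# A student with 14 months left before the thesis deadline:
print(feasible_venues(journals, 14))  # [('Journal B', 9), ('Journal D', 12)]
```

In practice the tool would also weigh factors such as venue quality and acceptance rates, but the feasibility filter above is the part that depends on the student's time constraints.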

In the age of Machine Learning and AI, companies are racing towards better services and innovative solutions for better customer experiences. Businesses realize the need to take their big data insights further than ever before in order to serve, retain and win customers. 2017 has been a big year for big data analytics, with many companies understanding the value of storing and analyzing the huge streams of data collected from different sources. Big data is in a constant state of evolution, and the big-data market is expected to be worth $46.34 billion by 2018.

Our favourite tweets