Massachusetts Institute of Technology (MIT) researchers are developing DBSeer, a system that could alleviate the cloud-computing inefficiencies that arise from overprovisioning for database-intensive applications. DBSeer could also help lower the cost of cloud services and make it easier to diagnose application slowdowns.
Database-driven applications in the cloud can end up with roughly 20 times more hardware allocated than they should require, because server resources are provisioned for an application's peak demand. Moreover, when demand on a database-heavy application rises, the system slows significantly, since requests may need to modify the same data on several different servers.
DBSeer uses machine-learning techniques to build accurate models of the performance and resource demands of database-driven applications.
The MIT researchers used two techniques to forecast how a database-driven application will respond to a load increase. The first is a black-box approach: DBSeer tracks changes in the number and type of user requests alongside system performance, and correlates the two using machine-learning techniques (a minimal sketch of this kind of workload-to-performance regression appears below). The black-box technique excels at forecasting the impact of smaller fluctuations. For predicting the consequences of larger increases in demand, however, the researchers use a gray-box technique that factors in the unique characteristics of a specific database system.
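The article does not show DBSeer's actual models, but the black-box idea, correlating observed transaction counts with measured resource usage and extrapolating to a heavier load, can be sketched with a simple regression. The sketch below is illustrative only: the transaction types, measurements, and the choice of linear regression via scikit-learn are assumptions, not DBSeer's code.

```python
# Hypothetical sketch of a black-box workload-to-resource model in the
# spirit of DBSeer: observe counts of each transaction type per monitoring
# interval plus measured resource usage, fit a regression, then predict
# resource demand for a scaled-up transaction mix. All data and names
# here are illustrative.
import numpy as np
from sklearn.linear_model import LinearRegression

# Each row: counts of [new_order, payment, order_status] transactions
# observed in one monitoring interval (a TPC-C-like mix, for illustration).
tx_mix = np.array([
    [120,  80,  40],
    [200, 130,  70],
    [310, 190,  95],
    [450, 300, 150],
])

# Measured average CPU utilization (%) for each interval.
cpu_util = np.array([22.0, 35.5, 52.0, 76.0])

# Correlate workload with performance via a simple linear model.
model = LinearRegression().fit(tx_mix, cpu_util)

# Forecast CPU demand if the whole mix grows ~30% beyond the last interval.
future_mix = tx_mix[-1] * 1.3
predicted = model.predict(future_mix.reshape(1, -1))[0]
print(f"Predicted CPU utilization: {predicted:.1f}%")
```

A purely statistical model like this needs no knowledge of the database's internals, which is why it works well for small fluctuations but degrades for large load increases, where contention effects the gray-box model captures become dominant.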
From MIT News
Abstracts Copyright © 2013 Information Inc., Bethesda, Maryland, USA