Books by Nonita Sharma
1.290,95 kr. This book explores the complete system perspective, underlying theories, modeling, and applications of cyber-physical systems (CPS). To serve researchers and academics, the editors present the material from multiple perspectives, covering CPS in breadth. Topics range from the rudiments of such systems and their efficient management to recent research challenges and open issues. The book is divided into four sections covering the fundamentals of CPS, engineering-based solutions, applications, and advanced research challenges. The contents highlight a concept map of CPS, including the latest technological interventions, issues, and challenges; the integration of CPS with IoT and big data analytics; modeling solutions; distributed management; efficient energy management; and CPS research and education, with applications in the industrial, agricultural, and medical domains. The book is of interest to readers in both academia and industry.
- Book
416,95 kr. Technical Report from the year 2017 in the subject Computer Science - Internet, New Technologies, grade: 8, language: English, abstract: Tree boosting has empirically proven to be a highly effective and versatile approach to data-driven modelling. The core argument is that tree boosting can adaptively determine the local neighbourhoods of the model, thereby taking the bias-variance trade-off into consideration during model fitting. Recently, a tree boosting method known as XGBoost has gained popularity by providing higher accuracy. XGBoost further introduces some improvements which allow it to deal with the bias-variance trade-off even more carefully. In this research work, we propose an adaptive procedure, Learned Loss (LL), to update the loss function as the boosting proceeds. The accuracy of the proposed algorithm, XGBoost with the Learned Loss boosting function, is evaluated using the train/test method, K-fold cross-validation, and stratified cross-validation, and compared with state-of-the-art algorithms, namely XGBoost, AdaBoost, AdaBoost-NN, Linear Regression (LR), Neural Network (NN), Decision Tree (DT), Support Vector Machine (SVM), bagging-DT, bagging-NN, and Random Forest. The parameters evaluated are accuracy, Type 1 error, and Type 2 error (in percentages). This study uses a total of ten years of historical data, from Jan 2007 to Aug 2017, of two stock market indices, CNX Nifty and S&P BSE Sensex, both of which are highly voluminous. Further, in this research work, we investigate how XGBoost differs from more traditional ensemble techniques. Moreover, we discuss the regularization techniques that these methods offer and the effect these have on the models. In addition, we attempt to answer the question of why XGBoost seems to win so many competitions.
To do this, we provide some arguments for why tree boosting, and in particular XGBoost, is such a highly effective and versatile approach to predictive modelling.
- Book
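The evaluation protocol the abstract describes (K-fold and stratified cross-validation of a boosted-tree classifier) can be sketched as follows. This is a minimal illustration, not the book's code: the data is synthetic, standing in for the CNX Nifty / S&P BSE Sensex series, and scikit-learn's `GradientBoostingClassifier` is used as a stand-in for XGBoost (the `xgboost` package exposes a drop-in `XGBClassifier`).

```python
# Sketch of the evaluation protocol: K-fold and stratified cross-validation
# of a boosted-tree classifier. Synthetic features stand in for the real
# stock-index data used in the report.
import numpy as np
from sklearn.ensemble import GradientBoostingClassifier
from sklearn.model_selection import KFold, StratifiedKFold, cross_val_score

rng = np.random.default_rng(0)
X = rng.normal(size=(400, 8))            # hypothetical daily indicators
# Hypothetical up/down label driven by the first two features plus noise.
y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=400) > 0).astype(int)

model = GradientBoostingClassifier(n_estimators=100, max_depth=3)
kfold_acc = cross_val_score(model, X, y,
                            cv=KFold(n_splits=5, shuffle=True, random_state=0))
strat_acc = cross_val_score(model, X, y,
                            cv=StratifiedKFold(n_splits=5, shuffle=True,
                                               random_state=0))
print(f"5-fold accuracy:     {kfold_acc.mean():.3f}")
print(f"stratified accuracy: {strat_acc.mean():.3f}")
```

Stratified cross-validation preserves the class balance within each fold, which matters when the up/down labels are skewed.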
306,95 kr. Prediction models have reached a stage where a single model is no longer sufficient. To achieve better accuracy and performance, ensembles of models are used, and gradient boosting is a part of almost all such ensembles; winners of Kaggle competitions swear by it. Extreme Gradient Boosting (XGBoost) goes a step further by optimising the loss function. In this research work, a Squared Logistic Loss function is used with the boosting procedure, which is expected to reduce both bias and variance. The proposed model is applied to ten years of stock market data. The Squared Logistic Loss function with XGBoost promises to be an effective approach in terms of accuracy and prediction quality.
- Book
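How a custom loss enters a boosting procedure can be sketched with a minimal hand-rolled gradient-boosting loop: each round fits a shallow tree to the negative gradient of the loss at the current scores. The book's exact Squared Logistic Loss is not reproduced here; the loss L(y, f) = log(1 + exp(-y f))^2 with labels y in {-1, +1} is an illustrative assumption, as are all names below.

```python
# Minimal gradient-boosting loop with a pluggable loss, illustrating how a
# custom loss (such as a squared logistic loss) enters the fitting procedure.
import numpy as np
from sklearn.tree import DecisionTreeRegressor

def squared_logistic_grad(y, f):
    """Gradient of the illustrative loss log(1 + exp(-y*f))**2 w.r.t. f."""
    log_term = np.log1p(np.exp(-y * f))
    return -2.0 * log_term * y / (1.0 + np.exp(y * f))

def boost(X, y, n_rounds=50, lr=0.1):
    """Additive model of shallow trees fit to the loss's negative gradient."""
    f = np.zeros(len(y))                         # raw ensemble scores
    trees = []
    for _ in range(n_rounds):
        residual = -squared_logistic_grad(y, f)  # pseudo-residuals
        tree = DecisionTreeRegressor(max_depth=2).fit(X, residual)
        f += lr * tree.predict(X)                # shrunken additive update
        trees.append(tree)
    return trees, f

# Toy demonstration on separable data; labels are in {-1, +1}.
rng = np.random.default_rng(0)
X = rng.normal(size=(200, 2))
y = np.where(X[:, 0] + X[:, 1] > 0, 1, -1)
trees, scores = boost(X, y)
train_acc = np.mean(np.sign(scores) == y)
```

Squaring the logistic loss penalises confidently wrong predictions more heavily than the plain logistic loss, which is one way a modified loss can trade bias against variance; the `xgboost` library accepts such a custom objective via its `obj` parameter (gradient and Hessian).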
337,95 kr.
- Book