International Journal of Engineering Trends and Technology

Research Article | Open Access

Volume 67 | Issue 1 | Year 2019 | Article Id. IJETT-V67I1P210 | DOI : https://doi.org/10.14445/22315381/IJETT-V67I1P210

Big Data: Data Science Applications and Present Scenario


Shubhankar Chaturvedi and Shwetank Kanava

Citation:

Shubhankar Chaturvedi and Shwetank Kanava, "Big Data: Data Science Applications and Present Scenario," International Journal of Engineering Trends and Technology (IJETT), vol. 67, no. 1, pp. 57-59, 2019. Crossref, https://doi.org/10.14445/22315381/IJETT-V67I1P210

Abstract

In this paper, we present a brief study of data science, a topic that has been discussed frequently in the scientific community. We also outline some recent trends and techniques and their impact on the scientific as well as the social community.

Keywords

Big Data

