I have experience in internet technologies, networking, databases, management, and artificial intelligence.
I love all kinds of sports, such as gaming, swimming, football, and running.
Learning English with AI
How We Can Use AI in Education
The development of deep learning systems for diagnosing chronic diseases is challenging. Furthermore, localizing and identifying objects such as white blood cells (WBCs) in leukemia without preprocessing or traditional hand segmentation of cells is difficult because of the irregular and distorted shape of the nucleus. This paper proposes a computer-aided detection system that relies entirely on deep learning and uses three models (CAD3) to detect and classify three types of WBC, which is fundamental to leukemia diagnosis. The system uses a modified You Only Look Once (YOLO v2) algorithm and a convolutional neural network (CNN). The proposed system was trained and evaluated on a dataset created and prepared specifically for the addressed problem, without any traditional segmentation or preprocessing of the microscopic images. The study shows that dividing the addressed problem into sub-problems achieves better performance and accuracy. Furthermore, the results show that CAD3 achieves an average precision (AP) of up to 96% in leukocyte detection and an accuracy of 94.3% in leukocyte classification. Moreover, CAD3 produces a report containing complete information about the WBCs. Finally, CAD3 proved its efficiency on other datasets such as the acute lymphoblastic leukemia image database (ALL-IDB1) and the blood cell count dataset (BCCD).
The development of machine learning systems for diagnosing chronic diseases is challenging, mainly because of the lack of data and the difficulty of diagnosis. This paper compares two proposed computer-aided diagnosis (CAD) systems for detecting and classifying three types of white blood cells, which is fundamental to acute leukemia diagnosis. Both systems rely on the You Only Look Once (YOLOv2) algorithm based on a convolutional neural network (CNN). The first system detects and classifies leukocytes at the same time and is called computer-aided diagnosis with one model (CADM1). The second system separates detection and classification into two models and is called computer-aided diagnosis with two models (CADM2). The main purpose of the paper is to show, by comparing CADM1 and CADM2, that fragmenting the main task into sub-tasks yields higher performance and accuracy. The paper also shows that diagnosis can rely on deep learning alone, without any traditional segmentation or preprocessing of the microscopic images. CADM1 achieved average precision for detection and classification of 56% for class 1, 69% for class 2, and 72% for class 3, while CADM2 achieved an average precision of up to 94% for leukocyte detection and an accuracy of 92.4% for classification. The results of the second system are very suitable for diagnosing leukocytes in leukemia.
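To make the two-model idea behind CADM2 concrete, the following is a minimal sketch, not the authors' implementation: it assumes that a YOLOv2-style detector has already returned bounding boxes for leukocytes, and a small CNN then classifies each cropped cell into one of three WBC types. All names here (WBCClassifier, classify_detections, the 64x64 crop size) are hypothetical, written in PyTorch purely for illustration.

```python
# Hypothetical two-stage pipeline sketch (detect first, then classify crops).
import torch
import torch.nn as nn
import torch.nn.functional as F

class WBCClassifier(nn.Module):
    """Small CNN that classifies a cropped leukocyte into 3 WBC types."""
    def __init__(self, num_classes: int = 3):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.head = nn.Linear(32 * 16 * 16, num_classes)  # for 64x64 input crops

    def forward(self, x):
        x = self.features(x)
        return self.head(x.flatten(1))

def classify_detections(image: torch.Tensor, boxes, model: WBCClassifier):
    """Crop each detected box from the image and classify it.

    image: (3, H, W) tensor; boxes: list of (x1, y1, x2, y2) pixel coordinates
    assumed to come from a separate detector. Returns one class index per box.
    """
    preds = []
    for x1, y1, x2, y2 in boxes:
        crop = image[:, y1:y2, x1:x2].unsqueeze(0)              # (1, 3, h, w)
        crop = F.interpolate(crop, size=(64, 64),               # fixed input size
                             mode="bilinear", align_corners=False)
        with torch.no_grad():
            logits = model(crop)
        preds.append(int(logits.argmax(dim=1)))
    return preds
```

Keeping the detector and the classifier as separate models in this way mirrors the paper's conclusion that splitting the main task into sub-tasks (detection, then classification) gives better precision than a single model that does both.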
The cloud is the best method for utilizing and organizing data, and it provides many resources to us over the internet. Many technologies are used in cloud computing systems, and each uses different kinds of protocols and methods. Many tasks that cannot be executed on a single local computer can be executed on different servers every second. The most popular technologies used in cloud systems are Hadoop, Dryad, and other MapReduce frameworks. There are also many tools used to optimize the performance of cloud systems, such as Cap3, HEP, and CloudBurst. This paper reviews the cloud computing system in detail, the technologies it uses, and the best technologies to use with it according to multiple factors and criteria such as cost, speed, and the pros and cons of each procedure. Moreover, a comprehensive comparison of the tools used for the utilization of cloud computing systems is presented.
In recent years, data and internet usage have grown rapidly, creating big-data problems. To address these problems, many software frameworks are used to increase the performance of distributed systems and to make large-scale data storage available. One of the most beneficial software frameworks for utilizing data in distributed systems is Hadoop, which clusters machines and coordinates the work between them. Hadoop consists of two major components: the Hadoop Distributed File System (HDFS) and MapReduce (MR). With Hadoop, we can process and count every word in a large file in a distributed way and determine how often each word occurs. In this paper, we explain what Hadoop is, its architecture, how it works, and its performance in distributed systems according to many authors. In addition, each paper is assessed and compared with the others.
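As a concrete illustration of the word-count workflow described above, here is a minimal Hadoop Streaming sketch in Python; the script name, crop of functionality, and the way it is invoked are illustrative assumptions, not taken from the reviewed papers.

```python
#!/usr/bin/env python3
"""Minimal word-count sketch for Hadoop Streaming: run with 'map' or 'reduce'."""
import sys

def mapper():
    # Emit "<word>\t1" for every word on stdin; Hadoop sorts these pairs by key
    # before they reach the reducer.
    for line in sys.stdin:
        for word in line.strip().split():
            print(f"{word}\t1")

def reducer():
    # Input arrives sorted by word, so all counts for the same word are adjacent.
    current_word, count = None, 0
    for line in sys.stdin:
        word, value = line.rstrip("\n").split("\t", 1)
        if word == current_word:
            count += int(value)
        else:
            if current_word is not None:
                print(f"{current_word}\t{count}")
            current_word, count = word, int(value)
    if current_word is not None:
        print(f"{current_word}\t{count}")

if __name__ == "__main__":
    mapper() if sys.argv[1:] == ["map"] else reducer()
```

The same script would be passed to Hadoop Streaming as both mapper and reducer (e.g., `-mapper "wordcount.py map" -reducer "wordcount.py reduce"`), with HDFS holding the input splits and MapReduce scheduling the map and reduce tasks across the cluster.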
There are many technologies used on the Internet to share files, each with different features, methods, and protocols. However, the most common and easiest one is the Web, which was established with a few simple features. The Web is continuously developing to become as easy as possible for users. Web developers want to make a machine that thinks like a human by adding new tools, methods, and protocols to the current Web. This paper focuses on the most widely used technologies in the Web and presents the stages of development of the World Wide Web. Moreover, the evolution of the Web from Web 1.0 to Web 3.0 and the Semantic Web is revealed. In addition, the paper explains the technologies and tools of the Web and compares Web portals with search engines. Finally, an assessment of the activation period of each one is presented.
Social Links
ResearchGate
ORCID
Google Scholar
Publons
LinkedIn
Scopus