Results: Among 550 pregnant women surveyed, 347 (63.1%) slept under a long-lasting insecticidal net the night before the survey. Urban residence (OR [95% CI] = 1.9 [1.22-3.01]), family size of 3-5 and >5 (2.8 [1.53-5.22] and 2.4 [1.20-5.03], respectively), and a history of malaria during the current pregnancy (3.0 [1.95-4.86]) were the factors associated with long-lasting insecticidal net utilization among pregnant women. Conclusion: Utilization of long-lasting insecticidal nets was low, and place of residence, malaria exposure during the current pregnancy, and family size were the factors associated with long-lasting insecticidal net utilization.

In recent years, the rising incidence of different cancers has produced a variety of data sources in this field. Consequently, many researchers have become interested in discovering useful knowledge from the available data to support faster decision-making by doctors and to reduce the negative consequences of such diseases. Data mining provides a set of techniques for discovering knowledge from data by detecting hidden patterns and finding unknown relations. However, these techniques face several challenges with real-world data; in particular, dealing with inconsistencies, errors, noise, and missing values requires appropriate preprocessing and data preparation procedures. In this article, we investigate the impact of preprocessing in providing high-quality data for classification techniques. A wide range of preprocessing and data preparation methods is studied, and a set of preprocessing steps is applied to obtain appropriate classification results. The preprocessing is performed on a real-world dataset, and classification performance is significantly improved after data preprocessing, especially in terms of sensitivity, F-measure, precision, and G-mean (see the illustrative pipeline sketched below).

In recent years, hyperspectral imaging (HSI) has been shown to be a promising imaging modality for assisting pathologists in the diagnosis of histological samples. In this work, we present the use of HSI for discriminating between normal and tumor breast cancer cells. Our customized HSI system includes a hyperspectral (HS) push-broom camera attached to a standard microscope and a home-made software system for controlling image acquisition. The HS microscopic system works in the visible and near-infrared (VNIR) spectral range (400-1000 nm). Using this system, 112 HS images were captured from histologic samples of human patients at 20× magnification. Cell-level annotations were made by an expert pathologist on digitized slides and were then registered with the HS images. A deep neural network consisting of nine 2D convolutional layers was developed for HS image classification. Different experiments were designed to split the data into training, validation, and testing sets; in all experiments, the training and testing sets correspond to independent patients. The results show an area under the curve (AUC) of more than 0.89 for all experiments. The combination of HSI and deep learning techniques can provide a useful tool to aid pathologists in the automatic detection of cancer cells on digitized pathologic images.
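For the data-preparation study described above, the following is a minimal sketch, in Python with scikit-learn, of the kind of preprocessing-plus-classification pipeline and the evaluation metrics named in that abstract (sensitivity, precision, F-measure, G-mean). The dataset, imputation strategy, and classifier are illustrative assumptions, not the authors' actual setup.

# Hypothetical preprocessing + classification pipeline: imputation of
# missing values, feature scaling, and evaluation with the metrics
# named in the abstract. The dataset is a stand-in.
import numpy as np
from sklearn.datasets import load_breast_cancer
from sklearn.impute import SimpleImputer
from sklearn.preprocessing import StandardScaler
from sklearn.pipeline import Pipeline
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import confusion_matrix, precision_score, recall_score, f1_score

X, y = load_breast_cancer(return_X_y=True)           # stand-in dataset
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.3, stratify=y, random_state=0)

clf = Pipeline([
    ("impute", SimpleImputer(strategy="median")),     # handles missing values (no-op here, shown for completeness)
    ("scale", StandardScaler()),                      # normalizes feature ranges
    ("model", RandomForestClassifier(random_state=0)),
])
clf.fit(X_train, y_train)
y_pred = clf.predict(X_test)

tn, fp, fn, tp = confusion_matrix(y_test, y_pred).ravel()
sensitivity = recall_score(y_test, y_pred)            # tp / (tp + fn)
specificity = tn / (tn + fp)
g_mean = np.sqrt(sensitivity * specificity)
print(f"precision={precision_score(y_test, y_pred):.3f} "
      f"F1={f1_score(y_test, y_pred):.3f} "
      f"sensitivity={sensitivity:.3f} G-mean={g_mean:.3f}")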
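For the nine-layer convolutional network just described, the abstract specifies only the number of 2D convolutional layers; the sketch below (PyTorch) shows one plausible realization for classifying HS patches as normal or tumor. The band count, patch size, channel widths, and pooling schedule are all assumptions.

# Hypothetical 9-layer 2D CNN for hyperspectral patch classification
# (normal vs. tumor). Only the count of 2D convolutional layers comes
# from the abstract; every other architectural choice is assumed.
import torch
import torch.nn as nn

N_BANDS = 300          # assumed number of spectral bands (VNIR, 400-1000 nm)
PATCH_SIZE = 32        # assumed spatial patch size in pixels

class HSICancerNet(nn.Module):
    def __init__(self, n_bands=N_BANDS, n_classes=2):
        super().__init__()
        widths = [n_bands, 64, 64, 128, 128, 128, 256, 256, 256, 256]
        blocks = []
        for i in range(9):                    # nine 2D convolutional layers
            blocks += [
                nn.Conv2d(widths[i], widths[i + 1], kernel_size=3, padding=1),
                nn.BatchNorm2d(widths[i + 1]),
                nn.ReLU(inplace=True),
            ]
            if i in (2, 5, 8):                # downsample after every third conv
                blocks.append(nn.MaxPool2d(2))
        self.features = nn.Sequential(*blocks)
        self.classifier = nn.Sequential(
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(widths[-1], n_classes),
        )

    def forward(self, x):                     # x: (batch, bands, H, W)
        return self.classifier(self.features(x))

model = HSICancerNet()
logits = model(torch.randn(4, N_BANDS, PATCH_SIZE, PATCH_SIZE))
print(logits.shape)                           # torch.Size([4, 2])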
Hyperspectral imaging (HSI), which acquires up to hundreds of bands, has been proposed as a promising imaging modality for digitized histology beyond RGB imaging, providing more quantitative information to assist pathologists with disease detection in samples. While digitized RGB histology is quite standardized and easy to acquire, histological HSI often requires custom-made equipment and longer imaging times than RGB. In this work, we present a dataset of corresponding RGB digitized histology and histological HSI of breast cancer, and we develop a conditional generative adversarial network (GAN) to artificially synthesize HSI from standard RGB images of normal and cancer cells. The results of the GAN-synthesized HSI are promising, showing a structural similarity (SSIM) of approximately 80% and a mean absolute error (MAE) of 6-11% (see the illustrative computation of these metrics sketched below). Further work is needed to establish the ability to generate HSI from RGB images on larger datasets.

Mitral valve repair or replacement is important in the treatment of mitral regurgitation. For valve replacement, a transcatheter approach has the potential to decrease the invasiveness of the procedure while retaining the benefit of replacement over repair. However, fluoroscopy images acquired during the procedure provide no anatomical information about the placement of the probe tip once the catheter has entered a cardiac chamber. By acquiring 3D ultrasound and registering the 3D ultrasound images to the fluoroscopy images, a physician can gain a greater understanding of the mitral valve region during transcatheter mitral valve replacement surgery. In this work, we present a graphical user interface that allows the registration of two co-planar X-ray images with 3D ultrasound during mitral valve replacement surgery.

Guided biopsy of soft tissue lesions can be challenging in the presence of sensitive organs or when the lesion itself is small. Computed tomography (CT) is the most frequently used modality for targeting soft tissue lesions. To aid physicians, small field-of-view (FOV), low-dose, non-contrast CT volumes are acquired prior to the intervention, while the patient is on the procedure table, to localize the lesion and plan the best approach. However, patient motion between the end of the scan and the start of the biopsy can make it difficult for a physician to translate the lesion location from the CT onto the patient's body, especially for a deep-seated lesion. In addition, the needle must be steered along a three-dimensional trajectory to reach the lesion while avoiding vital structures, which is especially challenging for less experienced interventionists. These challenges usually result in multiple additional image acquisitions during the procedure to ensure accurate needle placement, especially when multiple core biopsies are required. In this work, we present an augmented reality (AR)-guided biopsy system and procedure for soft tissue and lung lesions and quantify the results in a phantom study. For soft tissue lesions, we found an average error of 0.75 cm from the lesion center when AR guidance was used, compared to 1.52 cm during unguided biopsy; for lung lesions, the average error was 0.62 cm from the tumor center with AR guidance versus 1.12 cm with unguided biopsies. The AR-guided system improves accuracy and could be useful in clinical applications.
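To make the SSIM and MAE figures reported for the GAN-synthesized HSI concrete, the snippet below shows one way these metrics can be computed per band between a measured and a synthesized HS cube using scikit-image and NumPy. The cube shape, value range, and band-averaging scheme are assumptions rather than the authors' evaluation code.

# Hypothetical evaluation of a synthesized HS cube against the measured
# one: band-averaged SSIM and mean absolute error (MAE), the two metrics
# quoted in the GAN abstract. Data here are synthetic placeholders.
import numpy as np
from skimage.metrics import structural_similarity as ssim

rng = np.random.default_rng(0)
real = rng.random((256, 256, 100)).astype(np.float32)       # H x W x bands, scaled to [0, 1]
fake = np.clip(real + 0.05 * rng.standard_normal(real.shape), 0, 1).astype(np.float32)

ssim_per_band = [
    ssim(real[..., b], fake[..., b], data_range=1.0) for b in range(real.shape[-1])
]
mae = np.mean(np.abs(real - fake))

print(f"mean SSIM = {np.mean(ssim_per_band):.3f}")           # abstract reports approx. 80% on real data
print(f"MAE       = {100 * mae:.1f}% of full scale")         # abstract reports 6-11% on real data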
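The placement errors reported for the AR-guided biopsy study are distances from the needle tip to the lesion center; a minimal sketch of how such errors might be tabulated across phantom trials is given below. All coordinates and trial counts are hypothetical.

# Hypothetical tabulation of needle-placement error in a phantom study:
# Euclidean distance (in cm) between each recorded needle-tip position
# and the lesion center, averaged per guidance condition.
import numpy as np

lesion_center = np.array([4.0, 2.5, 7.0])                    # cm, phantom coordinate frame

tips = {
    "AR-guided": np.array([[4.3, 2.9, 7.4], [3.6, 2.2, 6.6], [4.5, 2.7, 7.5]]),
    "unguided":  np.array([[5.1, 3.4, 7.9], [2.9, 1.6, 6.1], [5.0, 3.5, 8.1]]),
}

for condition, pts in tips.items():
    errors = np.linalg.norm(pts - lesion_center, axis=1)     # per-trial error (cm)
    print(f"{condition:10s} mean error = {errors.mean():.2f} cm")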