As the Internet has developed rapidly over the past ten years, with its operating environment constantly changing alongside advances in computer and communication technology, network congestion has become increasingly serious. TCP is the primary transport-layer protocol on the Internet, and TCP Vegas is a congestion control algorithm that uses an increase in round-trip time (RTT) as a signal of network congestion. However, when network buffers fill, the RTT measured by Vegas increases, causing Vegas to reduce its congestion window and slow transmission; for this reason, Vegas has not been widely adopted on the Internet. In this paper, an improved algorithm called TCP Vegas-A is proposed, consisting of two parts. The first part adjusts the congestion window used for congestion avoidance together with the TTL (Time To Live) mechanism, which limits the lifetime of a packet in the network. The second part is a priority-based packet-sending strategy in which jitter is used as a congestion-signal indication. The combination of the two is expected to improve the efficiency of congestion detection. A mathematical model is established, and analysis of the model shows that the algorithm better controls congestion, improves network throughput, decreases the packet loss rate, and increases network utilization. Simulations are carried out on the NS-2 network simulation platform, and the results support the theoretical analysis.
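The RTT-driven window behavior that the abstract identifies as Vegas's weakness can be illustrated with the textbook Vegas congestion-avoidance rule. This is a minimal sketch of standard Vegas, not of the proposed Vegas-A; the alpha/beta thresholds and packet-based units are the conventional ones, not values taken from this paper.

```python
# Minimal sketch of the standard TCP Vegas congestion-avoidance rule.
# alpha/beta defaults are the textbook convention, not from this paper.

def vegas_update(cwnd, base_rtt, current_rtt, alpha=2, beta=4):
    """Adjust the congestion window once per RTT, Vegas-style."""
    expected = cwnd / base_rtt             # throughput with no queuing (pkts/s)
    actual = cwnd / current_rtt            # measured throughput (pkts/s)
    diff = (expected - actual) * base_rtt  # estimated packets queued in network
    if diff < alpha:                       # little queuing: probe for bandwidth
        return cwnd + 1
    if diff > beta:                        # queue building up: back off
        return cwnd - 1
    return cwnd                            # within target band: hold steady
```

A rising RTT (buffer filling) inflates `diff` and drives the window down, which is exactly the behavior the abstract cites as the cause of Vegas's reduced transmission rate.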
In the last two decades, underwater acoustic sensor networks have begun to be used for commercial and non-commercial purposes. This paper focuses on improving the performance of a monitoring system for oil pipelines. Linear wireless sensor networks are a model for underwater applications, and many data-collection solutions for them have been developed in research studies in previous years. Underwater environments have certain inherent limitations, such as large propagation delays, high error rates, limited bandwidth, and short-range communication. Many deployment algorithms and routing algorithms have been applied in this field. In this work, a new hierarchical network model is proposed with an improvement to the Smart Redirect or Jump (SRJ) algorithm. The improved algorithm is used in an underwater linear wireless sensor network for data transfer, reducing the complexity of the routing algorithm for relay nodes, which otherwise increases communication delay. The work is implemented using an integration of OMNeT++ and MATLAB, and the results are evaluated in terms of throughput, energy consumption, and end-to-end delay.
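The abstract does not spell out the redirect-or-jump decision rule, so the following is only an illustrative sketch of how a relay in a linear chain might choose between its immediate neighbor ("redirect") and the node after it ("jump"); the residual-energy criterion, the node fields, and the `min_energy` threshold are all assumptions for illustration.

```python
# Illustrative only: the SRJ decision rule is not specified in the abstract;
# the residual-energy criterion and node representation here are assumptions.

def next_relay(nodes, i, min_energy=0.2):
    """Pick the next hop for relay i in a linear chain: redirect to the
    immediate neighbor if it has enough residual energy, else jump over it."""
    neighbor, after = i + 1, i + 2
    if neighbor >= len(nodes):
        return None                      # i is already the last node
    if nodes[neighbor]["energy"] >= min_energy or after >= len(nodes):
        return neighbor                  # redirect: normal one-hop forward
    return after                         # jump: skip the depleted neighbor

chain = [{"energy": 1.0}, {"energy": 0.1}, {"energy": 0.9}, {"energy": 0.8}]
assert next_relay(chain, 0) == 2         # node 1 is depleted, so jump to node 2
```

Keeping the per-node decision this simple is the kind of routing-complexity reduction for relay nodes that the abstract targets.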
This study evaluates the performance and efficiency of four deep learning models—VGG-16, ResNet-50, Inception-V3, and DenseNet-121—in detecting pneumonia from chest X-rays, addressing the critical need for balanced accuracy and computational efficiency in clinical diagnostics. Methods: A dataset of 5,234 chest X-rays (3,875 pneumonia, 1,341 normal) was augmented via rotation, flipping, and zooming to mitigate class imbalance. Models were trained on an RTX 2060 GPU for 40 epochs, with performance assessed using accuracy, F1 score, sensitivity, specificity, precision, and computational metrics (training time, memory usage). Statistical significance was validated via paired t-tests (p < 0.05). Results: DenseNet-121 achieved the highest accuracy (95.2% ± 0.8), F1 score (95.1% ± 0.7), and throughput (400 images/sec) with minimal memory usage (33MB). ResNet-50 and Inception-V3 showed moderate performance, while VGG-16 exhibited overfitting tendencies. In conclusion, DenseNet-121 outperformed the other models in both accuracy and processing speed, which is essential for use in real-time clinical settings. However, the small size of the validation set and the limited population diversity are important limitations that should be addressed in future studies, and further testing on larger datasets is needed to confirm the model's stability and to assess how it performs in different settings. Future work should address ethical considerations in AI-driven diagnostics and validate findings across multi-institutional datasets.
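The evaluation metrics reported in this abstract all derive from a binary confusion matrix with pneumonia as the positive class. A minimal sketch of those definitions follows; the counts in the example are made up for illustration and are not the study's results.

```python
# Sketch of the reported metrics computed from a binary confusion matrix
# (pneumonia = positive class). The example counts below are invented.

def metrics(tp, fp, fn, tn):
    """Return accuracy, precision, sensitivity, specificity, F1."""
    accuracy = (tp + tn) / (tp + fp + fn + tn)
    precision = tp / (tp + fp)                 # positive predictive value
    sensitivity = tp / (tp + fn)               # recall / true positive rate
    specificity = tn / (tn + fp)               # true negative rate
    f1 = 2 * precision * sensitivity / (precision + sensitivity)
    return accuracy, precision, sensitivity, specificity, f1

# Hypothetical counts, not from the paper:
acc, prec, sens, spec, f1 = metrics(tp=380, fp=10, fn=15, tn=120)
```

Reporting sensitivity and specificity alongside accuracy matters here because the dataset is imbalanced (3,875 pneumonia vs. 1,341 normal), so accuracy alone can overstate performance.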