doi: 10.29271/jcpsp.2025.06.679

Globally, patient safety is the cornerstone of medical practice. As we move rapidly into the era of digital technology, increasing efforts are focused on how best AI can be integrated into patient safety practices.1 As a lower-middle-income country (LMIC), Pakistan has several opportunities through which AI can be integrated for patient safety.
First, there is a growing digital health infrastructure, including initiatives such as the Sehat Sahulat Programme2 and the adoption of electronic health records (EHRs) in hospitals,3 which can provide a foundation for introducing AI-driven analytics. Moreover, telemedicine platforms such as DoctHERS and Marham are expanding and use AI for diagnostics and triage, thereby improving access to care.4,5 Second, encouraging collaborative interdisciplinary initiatives, for example between IT professionals and healthcare researchers, can lead to localised AI solutions for healthcare. Third, growing government and private sector partnerships can support health-tech startups; for instance, P@SHA (Pakistan Software Houses Association) and IGNITE (National Technology Fund) have already funded such startups.6,7
There are key areas where integrating AI into patient safety can be particularly beneficial. The first is diagnostic accuracy, where AI can be used in radiology, for example, to detect TB and lung cancer.8 One of the open-source models is CheXpert (chest X-ray analysis) from Stanford University, which can be adapted to the local setting.9 Similarly, OpenSlide (an open-source tool for pathology) can support the detection of cancer and malaria in blood smears and can reduce human error.10
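To illustrate how an open-source chest X-ray model might be adapted to a local setting, the sketch below fine-tunes a pretrained DenseNet-121 (the backbone family used by CheXpert-style models) on a hypothetical locally collected dataset; the folder path, label set, and training settings are assumptions for illustration, not part of the CheXpert release.

    # Hedged sketch: fine-tune a pretrained DenseNet-121 for TB screening on a
    # locally collected chest X-ray dataset. Folder layout assumed (hypothetical):
    # local_cxr/normal/*.png and local_cxr/tb/*.png
    import torch
    import torch.nn as nn
    from torchvision import datasets, models, transforms
    from torch.utils.data import DataLoader

    preprocess = transforms.Compose([
        transforms.Grayscale(num_output_channels=3),  # X-rays are single-channel
        transforms.Resize((224, 224)),
        transforms.ToTensor(),
    ])
    train_set = datasets.ImageFolder("local_cxr", transform=preprocess)
    loader = DataLoader(train_set, batch_size=16, shuffle=True)

    model = models.densenet121(weights=models.DenseNet121_Weights.DEFAULT)
    model.classifier = nn.Linear(model.classifier.in_features, 2)  # normal vs. TB

    optimiser = torch.optim.Adam(model.parameters(), lr=1e-4)
    criterion = nn.CrossEntropyLoss()
    model.train()
    for epoch in range(3):  # short run for illustration only
        for images, labels in loader:
            optimiser.zero_grad()
            loss = criterion(model(images), labels)
            loss.backward()
            optimiser.step()

Any such locally adapted model would, of course, require clinical validation on local data before deployment.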
Medication safety is another area where AI-powered clinical decision support systems can flag medicine interactions and dosage errors.11 MedMinder (an open-source drug interaction checker) can be helpful in reducing medication errors.12
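As a minimal sketch of how such a decision support check might work, the following rule-based example flags known interacting drug pairs in a prescription; the interaction table is an illustrative placeholder, not a validated clinical knowledge base.

    # Hedged sketch of a rule-based interaction check; the pairs below are
    # illustrative placeholders, not a clinical reference.
    INTERACTIONS = {
        frozenset({"warfarin", "aspirin"}): "increased bleeding risk",
        frozenset({"methotrexate", "trimethoprim"}): "bone marrow suppression",
    }

    def check_prescription(drugs):
        """Return warnings for any known interacting pair in the prescription."""
        warnings = []
        drugs = [d.lower() for d in drugs]
        for i, first in enumerate(drugs):
            for second in drugs[i + 1:]:
                note = INTERACTIONS.get(frozenset({first, second}))
                if note:
                    warnings.append(f"ALERT: {first} + {second} -- {note}")
        return warnings

    print(check_prescription(["Warfarin", "Aspirin", "Paracetamol"]))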
For maternal and child health, AI-powered SMS chatbots, such as RapidPro by UNICEF, are freely available to send automated pregnancy alerts and vaccination reminders and work with basic phones; smartphones are not needed.13
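A minimal sketch of how such a reminder might be triggered programmatically is shown below, assuming a hosted RapidPro instance and an API token; the host, token, and phone number are placeholders, and the endpoint shape should be verified against the instance's own API documentation.

    # Hedged sketch: sending a vaccination reminder through a hosted RapidPro
    # instance over its REST API. Host, token, and number are placeholders;
    # verify the /api/v2/broadcasts.json payload against the live API docs.
    import requests

    RAPIDPRO_HOST = "https://rapidpro.example.org"   # hypothetical deployment
    API_TOKEN = "REPLACE_WITH_API_TOKEN"

    def send_reminder(phone_number, message):
        response = requests.post(
            f"{RAPIDPRO_HOST}/api/v2/broadcasts.json",
            headers={"Authorization": f"Token {API_TOKEN}"},
            json={"urns": [f"tel:{phone_number}"], "text": message},
            timeout=10,
        )
        response.raise_for_status()
        return response.json()

    send_reminder("+923001234567", "Reminder: your child's vaccination is due tomorrow.")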
Predictive analytics is another important area where AI can predict sepsis, patient deterioration, or readmission risk in ICUs.14 FHIR Clinical Decision Support (CDS) Hooks can be used to integrate with Pakistani EHR systems and alert doctors about patient risks such as sepsis and overdose.15 For public health and disease surveillance, Epitweetr, a free open-source tool for early outbreak detection, is available16 and can be used to predict dengue and cholera outbreaks, while Nextstrain,17 an open-source real-time pathogen tracking platform, is available for monitoring COVID-19, polio, and influenza variants.
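As a sketch of how a CDS Hooks-style alert could surface a model's prediction inside an EHR, the example below implements a minimal service that returns a warning card when a placeholder sepsis-risk score exceeds a threshold; the service id, threshold, and scoring function are illustrative assumptions.

    # Hedged sketch of a CDS Hooks service returning a warning card when a
    # (placeholder) sepsis-risk model flags a patient. Card fields follow the
    # CDS Hooks structure (summary, indicator, source).
    from flask import Flask, request, jsonify

    app = Flask(__name__)

    def sepsis_risk_score(prefetch):
        # Placeholder: a real deployment would call a validated model here.
        return 0.9

    @app.route("/cds-services", methods=["GET"])
    def discovery():
        return jsonify({"services": [{
            "id": "sepsis-alert",                  # hypothetical service id
            "hook": "patient-view",
            "title": "Sepsis risk alert",
            "description": "Warns clinicians when predicted sepsis risk is high.",
        }]})

    @app.route("/cds-services/sepsis-alert", methods=["POST"])
    def sepsis_alert():
        payload = request.get_json()
        risk = sepsis_risk_score(payload.get("prefetch", {}))
        cards = []
        if risk > 0.8:                             # illustrative threshold
            cards.append({
                "summary": f"High predicted sepsis risk ({risk:.0%})",
                "indicator": "warning",
                "source": {"label": "Local sepsis prediction model"},
            })
        return jsonify({"cards": cards})

    if __name__ == "__main__":
        app.run(port=8080)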
AI can also support hospital workflow and administration.18 For example, Bahmni,19 an open-source hospital management system, can be combined with AI analytics to predict ICU bed shortages and patient waiting times; it is already used in countries across Africa and in India, and can be customised for hospitals in Pakistan. Similarly, the LibreHealth EHR20 is freely available, supports automated alerts for errors, and offers a cheaper alternative to expensive commercial EHRs, which can help reduce misdiagnoses and duplicate tests.
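As a simple illustration of the kind of analytics involved, the sketch below fits a linear trend to a hypothetical daily ICU occupancy log and flags days projected to approach capacity; the file name, capacity figure, and model choice are assumptions, and a real deployment would need richer data and local validation.

    # Hedged sketch: forecasting next-week ICU bed demand from a hypothetical
    # occupancy log (icu_occupancy.csv with columns day_index, beds_used).
    # A simple linear trend is used purely for illustration.
    import numpy as np
    import pandas as pd
    from sklearn.linear_model import LinearRegression

    history = pd.read_csv("icu_occupancy.csv")        # assumed local export
    X = history[["day_index"]].to_numpy()
    y = history["beds_used"].to_numpy()

    model = LinearRegression().fit(X, y)
    next_week = np.arange(X.max() + 1, X.max() + 8).reshape(-1, 1)
    forecast = model.predict(next_week)

    ICU_CAPACITY = 20                                  # hypothetical ward size
    for day, beds in zip(next_week.ravel(), forecast):
        if beds > 0.9 * ICU_CAPACITY:
            print(f"Day {day}: projected {beds:.0f} beds used -- near capacity")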
When preparing to implement these tools, pilot studies should be conducted first; these can help identify what needs to be adapted for the local context. In addition, training healthcare workers through short courses on using the AI tools is imperative to ensure correct and effective use. Local language support should be provided for chatbots, for example in Urdu, Sindhi, Balochi, and Pashto; a simple sketch of such support follows.
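As a minimal illustration of local-language support, a reminder chatbot could keep its messages as per-language templates selected by the patient's preferred language; the strings below are English placeholders standing in for professionally translated Urdu, Sindhi, Balochi, and Pashto text.

    # Hedged sketch: per-language message templates for a reminder chatbot.
    # Strings are English placeholders for professionally translated text.
    TEMPLATES = {
        "ur": "[Urdu translation] Vaccination reminder: {child} is due on {date}.",
        "sd": "[Sindhi translation] Vaccination reminder: {child} is due on {date}.",
        "bal": "[Balochi translation] Vaccination reminder: {child} is due on {date}.",
        "ps": "[Pashto translation] Vaccination reminder: {child} is due on {date}.",
        "en": "Vaccination reminder: {child} is due on {date}.",
    }

    def build_message(language_code, child, date):
        template = TEMPLATES.get(language_code, TEMPLATES["en"])  # fall back to English
        return template.format(child=child, date=date)

    print(build_message("ur", "Ayesha", "12 July"))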
There are challenges which will need to be addressed. First, internet access is a major limitation, so offline versions of these tools should be available, which would require customising them for local contexts. Second, national policies, regulations, and guidelines on integrating AI for patient safety must be developed, including measures for data privacy in compliance with the Pakistan Data Protection Bill. Third, infrastructure and cost are issues for public hospitals; to address this, government and private sector organisations need to develop, in partnership, a strategic plan to initiate and implement these tools according to the requirements of public hospitals. Fourth, stakeholder acceptance is required to ensure the tools are actually used; hence, both doctors and staff need training and the willingness to work with them.
While these challenges may appear overwhelming, valuable lessons can be learnt from other LMICs that have successfully integrated AI to improve patient safety. For example, in Kenya, Jacaranda Health's PROMPTS is an SMS-based chatbot providing pregnancy advice and emergency alerts, which has reduced maternal mortality by 25% in piloted areas. In Rwanda, Zipline uses AI-supported delivery drones to carry blood and vaccines to remote areas, reducing blood delivery time from 4 hours to 15 minutes; Pakistan's flood-prone and mountainous regions face similar supply chain issues, and adapting such AI applications to local contexts could improve vaccine and blood delivery. In Nigeria, RxAll is an AI-based drug authentication application which detects counterfeit medicines in seconds and has helped reduce fake medicines. In Bangladesh, UNICEF-supported AI for dengue prediction is used to monitor, analyse, and predict dengue outbreaks, helping hospitals prepare through early alerts and warnings. In India, Qure.ai is used for chest X-ray analysis for tuberculosis (TB) and head CT scans for stroke, which has reduced misdiagnosis in rural areas where radiologists are scarce. Similarly, Microsoft's AI for eye care has been used to screen more than 100,000 patients in rural areas with few ophthalmologists.
In Pakistan, hospitals need to take the initiative to integrate AI into patient safety practices. This can begin with a hospital audit to identify areas of major risk, for example, maternal and neonatal complications or medication errors. This should be followed by exploring the available AI tools based on the prioritised need. For this, a team should be formed of relevant speciality doctors and clinical heads, IT specialists, data officers, and hospital administrators to ensure correct technology integration, handling of patient data privacy, integration of AI with any existing systems where required, and budget-related approvals. The team should train the healthcare staff and doctors and pilot the tool within the relevant department on a small sample for four to six months, monitoring the outcomes before expanding to other departments or scaling up. Through concentrated and consistent efforts, it will be possible to integrate AI for patient safety in ways that support both healthcare workers and patients.
REFERENCES