
Posted on: September 17, 2018 | By: Sagar Mahajan | In: Life Sciences, Pharmaceuticals and Biotech | 1 Comment


    Traditionally, a trial master file (TMF) has been thought of as a collection of documents to be collated to satisfy the regulatory requirements of a clinical study. While this is still the case, today’s increasing use of electronic TMFs and integrated clinical systems is giving the industry a more holistic and real-time view of clinical trial activities.


    As such, regulatory bodies now expect a TMF to be the story of “how the trial was conducted and managed.”


    Migrating eTMF documents between systems is no easy task. Migration involves much more than mapping document types and artifacts and harmonizing metadata.


    DIA is proposing a universal mechanism for document transfer, referred to as the eTMF Exchange Mechanism Standard (EMS). DIA released a preliminary version at its global meeting in Boston in June 2018, so let’s take a look at the background and specifications of this important development.


    The new EMS standard will be a major milestone, helping resolve the major issues the industry faces when migrating from one platform to another. It will also facilitate the exchange of eTMF content between CROs and from one system to another. The standard comprises a specification document, Exchange.xml, and Exchange.xsd (a schema for validating the XML format). The exchange schema is currently in development.


    The model specifies different tags required to encapsulate files and metadata values, individual objects, artifact files, electronic signature displays on artifact files, and audit trail entries for artifacts being exchanged.


    The eTMF-EMS process operates between two systems. The originating system produces the output of artifacts within a predefined folder structure, along with an Exchange.xml file that is validated against the Exchange.xsd schema. When the target system imports the Exchange.xml file, it performs the same validation checks, imports all artifacts, and files them under the relevant TMF artifact number within the system.
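The export side of this flow can be sketched in a few lines of Python. This is a minimal illustration only: the element and attribute names (`Exchange`, `Artifact`, `Metadata`, `AuditTrail`) are assumptions, since the real tag vocabulary is defined by DIA's Exchange.xsd schema, which was still in development at the time of writing.

```python
# Hypothetical sketch of an EMS-style manifest. The element names below are
# illustrative assumptions -- the real tags are defined by the DIA
# Exchange.xsd schema, which was still in development.
import xml.etree.ElementTree as ET

def build_exchange_manifest(artifacts):
    """Build a minimal Exchange.xml-style manifest for a list of artifacts.

    Each artifact is a dict with a file path, a TMF artifact number,
    metadata key/value pairs, and optional audit-trail entries.
    """
    root = ET.Element("Exchange")
    for art in artifacts:
        node = ET.SubElement(root, "Artifact", number=art["tmf_number"])
        ET.SubElement(node, "File").text = art["path"]
        meta = ET.SubElement(node, "Metadata")
        for key, value in art["metadata"].items():
            ET.SubElement(meta, "Value", name=key).text = value
        # Audit-trail entries travel with the artifact so the target
        # system can reconstruct its history after import.
        audit = ET.SubElement(node, "AuditTrail")
        for entry in art.get("audit", []):
            ET.SubElement(audit, "Entry", when=entry["when"]).text = entry["action"]
    return ET.tostring(root, encoding="unicode")

xml_text = build_exchange_manifest([{
    "tmf_number": "02.01.01",
    "path": "artifacts/protocol_v1.pdf",
    "metadata": {"studyId": "ABC-123", "docType": "Protocol"},
    "audit": [{"when": "2018-06-01T10:00:00Z", "action": "Approved"}],
}])
print(xml_text)
```

In a real implementation, the target system would validate this file against Exchange.xsd before importing any artifacts, rather than trusting the structure on faith.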


    When it comes to implementing EMS, there are a few potential scenarios:


    • Study or Contract Closure: Final eTMF transfer from CRO to sponsor for archiving
    • Merger and Acquisition: Migration of eTMF content
    • Platform Change or System Upgrade: Migration of eTMF documents


    Today, companies understand the value of compliant and inspection-ready TMF documentation throughout the clinical development lifecycle — and it has become critical to keep these documents ready as and when required for smoother market access.


    In the next installment, we will shed a little more light on EMS and discuss a few interesting use cases and best practices for implementing EMS within your organization. Stay tuned.


    Author
    Sagar Mahajan
    Sagar has more than 10 years of experience delivering projects pertaining to Clinical...

     
    Posted on: August 23, 2018 | By: Gauri Deshmukh | In: Clinical Data Management, Industries, Life Sciences, Pharmaceuticals and Biotech, SEND Services | 1 Comment

    The pharmaceutical industry is undergoing a huge transformation. The traditional ways of clinical research will soon be a thing of the past, as disruptive technologies like machine learning (ML), artificial intelligence (AI), natural language processing (NLP) and robotic process automation (RPA) move from niche positions to sweep across the pre-clinical and clinical research value chains.


    Under new FDA requirements that took effect in December 2016, sponsors must be able to receive and submit non-clinical toxicology study data in the SEND format. The new SEND standards have a major impact on the industry, as they are now binding for all sponsors submitting non-clinical data to the FDA, restructuring the way non-clinical data is collected and submitted.


    Long-running legacy processes and a lack of standardization have led to data standards that differ from company to company. Moreover, there may be inconsistency at many levels within a single organization, depending on how these standards are enforced.



    The major challenges that the industry is facing with SEND implementation are:

    Lack of resources with deep SEND and study-specific expertise

    • Deep standards knowledge is required for trial design creation and Study Data Reviewer’s Guide (SDRG) content analysis and authoring

    Tools for viewing study data and cross-study analytics

    • Increasing need for data exploration
    • Deployment, integration, advanced visualization and support for the selected system(s)

    Global harmonization of SEND processes after mergers & acquisitions

    Time-consuming and costly manual data QA checks

    • Existing products do not provide certain automated data checks
    • Many companies are using older Laboratory Information Management Systems (LIMS), which do not support SEND-compliant data collection

    Data merging issues

    • Use of multiple LIMS systems
    • Multiple source files from CROs, in-house labs, etc.

    Implementing SEND standards requires deep knowledge and expertise in understanding, planning, implementing and submitting non-clinical data to regulatory bodies. Small errors in complying with SEND standards may result in a rejection, making it imperative that your submissions be prepared by experienced and knowledgeable subject matter experts.

    We are here to listen. At Syntel, we understand the core problem and put our experience to work addressing this challenge with a two-pronged approach.

    • Ready pool of SEND SMEs: Implementing SEND standards is a niche skill, and it can be difficult to identify and hire the right team members. Syntel has a comprehensive process to find the right talent for your organization, including an extensive screening process, SEND-specific training, an ongoing competency-building approach, and a robust knowledge management framework.
    • Quality assurance (QA): QA demands 100% adherence to SEND standards and processes, along with data consistency. We ensure that your SEND data is consistent with the original LIMS source data and conforms to the FDA’s implementation of the CDISC standard.
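As a simple illustration of what an automated QA check might look like, the sketch below compares numeric results in a SEND-style dataset against the original LIMS source. The record layouts and variable names (`USUBJID`, `TESTCD`, `STRESN`) are simplified stand-ins inspired by SEND domain variables, not a full SEND implementation.

```python
# Minimal sketch of an automated QA check: verify that numeric results in a
# SEND-formatted dataset match the original LIMS source. Record layouts are
# simplified illustrations, not real SEND domain structures.

def check_consistency(lims_rows, send_rows, tolerance=1e-9):
    """Return a list of discrepancies between LIMS source and SEND output.

    Rows are keyed on (subject id, test code); values are numeric results.
    """
    lims = {(r["subject"], r["test"]): r["result"] for r in lims_rows}
    issues = []
    for row in send_rows:
        key = (row["USUBJID"], row["TESTCD"])
        if key not in lims:
            issues.append(f"{key}: present in SEND but missing from LIMS")
        elif abs(lims[key] - row["STRESN"]) > tolerance:
            issues.append(f"{key}: LIMS={lims[key]} vs SEND={row['STRESN']}")
    return issues

lims_rows = [
    {"subject": "S001", "test": "BW", "result": 251.3},
    {"subject": "S002", "test": "BW", "result": 248.0},
]
send_rows = [
    {"USUBJID": "S001", "TESTCD": "BW", "STRESN": 251.3},
    {"USUBJID": "S002", "TESTCD": "BW", "STRESN": 247.0},  # deliberate mismatch
]
issues = check_consistency(lims_rows, send_rows)
print(issues)  # one discrepancy reported for S002
```

Checks like this can run automatically on every dataset build, catching transcription errors long before a submission reaches the FDA.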


    Syntel offers a full range of SEND operational and data management services, with a pool of trained, talented team members ready to deliver high-quality service based on SEND and QA processes, established information security controls, and a scalable model ready to help you build a world-class SEND organization.


    If you are ready to put your SEND compliance initiatives on the right track, talk to one of our experts today to learn how Syntel can deliver the mix of skills and quality you’ve been looking for.


    Contact us at Syntel_lifesciences@syntelinc.com today.

    Author
    Gauri Deshmukh
    Gauri Deshmukh is a Senior Consultant – Life Sciences at Syntel, providing Domain Solutions to...

     
    Posted on: August 16, 2018 | By: Shailesh Gadhave | In: Digital Age, Digital Solutions, Life Sciences, Pharmaceuticals and Biotech | No Comments


      The life sciences industry today is undergoing a wave of disruption, with biopharmaceutical companies embracing new ways to deliver value over volume by developing specialty products in target segments.


      This wave of change is due to the fact that science and business operations are becoming increasingly complex, competitive and global — with companies attempting to make a comprehensive impact in this highly competitive marketplace. With increasingly complex development processes and soaring research and development (R&D) investments, stakeholders (business, end users, investigators, etc.) are looking for a more robust, reliable and reproducible approach to bringing personalized products to the market.


      Even though the digital technology exists today to optimize the entire R&D value chain, adoption is low, especially in the clinical trials segment, because of factors like complexity, resource intensiveness, changing regulatory dynamics and lengthy implementations. However, while these factors may seem like an argument against implementation, the rewards are worth the effort.


      As industry leaders move towards adopting personalized medicine and a patient-centric approach, it is increasingly important for clinical development enterprises to gain access to the growing volumes of patient historical data, real-world evidence, genomic profiles and emerging research to meet sponsor expectations. Harnessing this data can help investigators get a 360º view of patient performance and demonstrate the true value of new treatments to key stakeholders for effective market access.


      Digital technologies can transform how companies approach clinical trial management, by enabling them to access a wealth of information from different data sources, improve patient enrollment and trial experience, capture real-time data insights and improve the quality of data collected during trials. Collectively, this can help achieve the following clinical objectives:


      • Expedite patient enrollment and retention by mining unstructured patient health data and identifying the right patient-trial match

      • Enhance the patient trial experience to transform subject onboarding, improve trial understanding and help set realistic expectations

      • Build an integrated trial management platform that can capture, integrate and analyze complex data during trials

      • Manage and monitor multiple sites and track performance using advanced analytics and visualizations, enabling early intervention at or shutdown of non-performing sites

      • Effectively manage stakeholder expectations and performance delivery
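As a toy illustration of the first objective, the sketch below matches structured patient records against a hypothetical set of eligibility criteria. Production systems would mine unstructured health data with NLP; here the inputs are already structured, and all names and criteria are invented, to keep the example self-contained.

```python
# Toy sketch of patient-trial matching against structured eligibility
# criteria. All patient data and criteria here are hypothetical examples.

def matches(patient, criteria):
    """True if the patient satisfies every eligibility criterion."""
    return (criteria["min_age"] <= patient["age"] <= criteria["max_age"]
            and criteria["condition"] in patient["diagnoses"]
            and not set(patient["diagnoses"]) & set(criteria["exclusions"]))

criteria = {
    "min_age": 18, "max_age": 65,
    "condition": "type2_diabetes",
    "exclusions": ["renal_failure"],
}
patients = [
    {"id": "P1", "age": 54, "diagnoses": ["type2_diabetes"]},
    {"id": "P2", "age": 41, "diagnoses": ["type2_diabetes", "renal_failure"]},
    {"id": "P3", "age": 70, "diagnoses": ["type2_diabetes"]},
]
eligible = [p["id"] for p in patients if matches(p, criteria)]
print(eligible)  # → ['P1']
```

The hard part in practice is not this final filter but extracting reliable structured facts (age, diagnoses, exclusions) from free-text clinical records, which is where ML and NLP earn their keep.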


      In our opinion, adopting digital technologies is an imperative for clinical development enterprises. Emerging technologies like artificial intelligence, machine learning and advanced analytics are empowering CROs to reduce cost, integrate trial management, establish tighter control, enable smarter manual intervention, deliver quality outcomes and reproduce results across multiple scenarios.


      In upcoming posts, we will further explore this promising synergy between technology and science to advance innovation in medicine. Stay tuned!



      Author
      Shailesh Gadhave
      Shailesh is a post-graduate candidate in Marketing-Life Sciences with a passion for developing novel...

       
      Posted on: March 26, 2018 | By: Sourav Gupta | In: Industries, Life Sciences | No Comments


        The journey so far


        The year 2017 was a learning phase for the medical device industry. With the official launch of the European Medical Device Regulation (EU MDR 2017/745) and the European In-Vitro Diagnostic Regulation (EU IVDR 2017/746), many medical device and in-vitro diagnostic companies faced uncertainty and confusion over the new requirements, as well as risks and challenges to their business.


        With few exceptions, I have seen most large manufacturers establish a program management office (PMO) team and rope in top management consulting companies to help determine the impact on their business and revenues. Small and mid-size companies are working with regulatory consultants to understand the new requirements.


        The second half of the last year was mostly spent on awareness sessions, workshops, gap assessment pilots and budget planning. I have spoken to many program directors to understand their strategy, budget and plan for the next two to three years, and led or participated in many knowledge sessions and workgroups. I have learned a great deal so far, and hope to share some of that with you today.


        Key Lessons


        1. Simplify program management by creating a 360° view dashboard for each product family. This dashboard should include parameters like revenue, markets, risks, technical files, labels/IFUs, authorized representative, notified body, manufacturing sites, economic operators, QMS, etc., along with your key stakeholders from various functions.

        2. Remove redundancy and organize technical files and design dossiers for each product family in a centralized repository.

        3. Digitize high-value legacy records and technical documents for gap assessment, inspection readiness and easy availability.

        4. Plan ahead for your organization-wide implementation of, or upgrade to, EN ISO 13485:2016 QMS, MEDDEV 2.7.1 rev. 4, UDI, and an eIFU website (content management framework).

        5. Prioritize the remediation of technical files for products with high business impact and complexity.

        6. Evaluate and onboard service providers early in the program to improve the success rate and on-time completion.

        7. Build an agile platform that enables internal and external stakeholders to engage and collaborate on a day-to-day basis.

        8. Implement tools and accelerators to improve visibility, tracking, quality and productivity across workstream projects and your overall program.
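As an illustration of the first lesson, the sketch below models the per-product-family data that might sit behind such a 360° program dashboard. The field names are assumptions drawn from the parameters listed above, not any standard schema.

```python
# Illustrative sketch of the per-product-family data behind a 360-degree
# EU MDR program dashboard. Field names are assumptions based on the
# parameters discussed in the text, not a standard schema.
from dataclasses import dataclass, field

@dataclass
class ProductFamilyStatus:
    name: str
    annual_revenue_musd: float
    markets: list = field(default_factory=list)
    notified_body: str = ""
    technical_files_total: int = 0
    technical_files_remediated: int = 0
    risk_level: str = "unknown"  # e.g. low / medium / high

    @property
    def remediation_pct(self):
        """Percentage of technical files remediated so far."""
        if self.technical_files_total == 0:
            return 0.0
        return 100.0 * self.technical_files_remediated / self.technical_files_total

family = ProductFamilyStatus(
    name="Infusion Pumps",           # hypothetical product family
    annual_revenue_musd=120.0,
    markets=["EU", "US"],
    notified_body="TBD",
    technical_files_total=40,
    technical_files_remediated=10,
    risk_level="high",
)
print(f"{family.name}: {family.remediation_pct:.0f}% remediated")
```

Aggregating records like this across all product families gives the PMO a single view for prioritizing remediation by revenue, risk and complexity, as lesson 5 recommends.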


        Current Status


        While the industry is waiting for the re-designation of the Notified Bodies, most manufacturers have completed the impact and gap assessment of key technical files and are currently planning for technical file remediation. Keep in mind that if you have a high volume of technical documents, or they are complex, in legacy formats, or not well-controlled or documented under the same product family, remediation will require much more advance planning.


        Key considerations for technical file remediation


        Planning
        • Create or update SOPs

        • Prepare quality plans for technical files and design history files (DHF)

        • Plan cross-functional work streams for document remediation

        • Identify critical success factors, risks and challenges

        • Develop a detailed checklist to review technical files

        Reformatting GHTF Summary Technical Documentation (STED)
        • Update formats (GHTF/SG1/N011:2008 for medical devices; GHTF/SG1/N063:2011 for in-vitro diagnostic devices) to align with the new requirements

        • Format documents into paginated and fully searchable PDF files

        • Devise a logical numbering for files (e.g. Part 1 of x, Part 2 of x… Part x of x)

        • Bookmark GHTF STED sections with clear document references

        • Write technical files in an official language of the member state where procedures are carried out, or the language accepted by the Notified Body. English is recommended for all audit-related documents.

        • Use digital signatures or scanned signature pages where signatures are required

        • Make the technical documentation a pointer document

        General Safety and Performance Requirements Checklist
        • Update the Essential Requirement Checklist (ERC) and map to the new requirements

        • Reference relevant standards and “state of the art” assessments (fully or partially applied)

        • Provide reference to harmonized standards and justification if not applicable

        • Look for objective evidence to support conformity, linking and bookmarks to relevant documents

        Declaration of Conformity
        • Ensure the product list in the technical documentation matches the DOC

        • Sign a new Declaration of Conformity


        Conclusion


        Successfully remediating technical files requires extremely careful planning and monitoring of different workstreams. By identifying critical success factors, risks and challenges early on, your planning will be easier and the chances of a “first-time-right” remediation program improve dramatically. By setting up a digital PMO dashboard, QC checklists, tools and accelerators, you can ensure a much smoother transition to EU MDR/IVDR compliance.


        Author
        Sourav Gupta
        Sourav Gupta has 14+ years of experience supporting the Life Sciences industry in the clinical,...

         
        Posted on: March 12, 2018 | By: Imran Sheikh | In: Life Sciences | 1 Comment


          Although high-profile data breaches grab the headlines, behind the scenes the life sciences industry is as information-sensitive as any. Every pharma company faces a major challenge in complying with the myriad regulatory requirements governing how it creates, handles and submits documentation about the products it manufactures.


          Even a minor slip-up can have a serious financial impact in the form of fines, penalties or delayed product launches. With so much at stake, document management is a critical process for any pharma company.


          There are a number of widely-accepted best practices to ensure compliance with regulations, but many of these are extremely complex to implement, meaning that the industry has struggled for years to put them in place. These include:


          • Harmonizing and standardizing all global regulatory submission templates
          • Keeping on top of changing and emerging requirements and timelines
          • Implementing strong versioning and change management practices
          • Maintaining strong governance and process management controls
          • Capturing key performance metrics and feedback from regulatory authorities


          At Syntel, our prescription for these challenges is a suite of end-to-end regulatory services, driven by an intelligent regulatory services platform. We combine the power of automation with decades of business process expertise to implement standardized, trusted processes for regulatory operations and submissions to health authorities. We can deliver business outcomes such as:


          • 20% reduction in regulatory writing and submission publishing time
          • Dramatic improvements in submission quality
          • Consistent templates for all clinical documents
          • Easier document lifecycle management


          A huge added benefit to more structured and consistent document management is the ability to capture rich metrics at every step in the process. By making the document initiation, preparation and publication process more transparent, we can deliver deep insights into your operations and make continuous process improvement a reality.


          If you are ready to take the next step in your document management operations, visit us online at https://www.syntelinc.com/industries/life-sciences.


          Author
          Imran Sheikh
          Imran Sheikh is Sr. Consultant, Regulatory Affairs at Syntel. He has over 10 years of rich...


          © 2018 Syntel, Inc.