
Prabhash Chandra Karan

Summary:
Principal Gen AI Engineer with 20+ years of overall experience, including 10+ years designing, developing, and deploying
AI/ML and Gen AI solutions using technologies such as natural language processing (NLP), deep learning (DL), computer
vision (OpenCV), OpenAI models (GPT-3.5/4), large language models (LLMs), foundation models (FMs), LangChain, and
retrieval augmented generation (RAG); skilled in ML frameworks (TensorFlow, PyTorch, Hugging Face) on multi-cloud
platforms (AWS, GCP, and Azure) and in A/B split testing; experienced in the design, planning, and deployment of
LLM-powered AI/ML services using agentic workflows, and in structured deployment at scale using containerization tools
(Docker and Kubernetes). Proven ability to lead technical teams, drive pre-sales solution design with POC development,
collaborate with stakeholders, and deliver results across the enterprise.

Education:
● MS, Computer Science with Distinction, Pace University, New York, USA
● BS, Engineering with First Class, Indian Institute of Engineering, Science and Technology, India

Certifications:
● MIT Designing and Building AI Products and Services 2024
● MIT Quantum Computing and Its Applications 2024
● MIT Quantum Algorithms for Cybersecurity and Optimization 2024
● GCP Certified Professional Machine Learning Engineer 2022
● GCP Certified Professional Data Engineer 2022
● GCP Certified Professional Cloud Database Engineer 2022
● GCP Certified Professional Cloud Security Engineer 2022
● GCP Certified Professional Cloud Network Engineer 2023
● GCP Certified Professional Cloud DevOps Engineer 2022
● GCP Certified Professional Cloud Developer 2022
● GCP Certified Professional Cloud Architect 2022
● GCP Certified Associate Cloud Engineer 2021
● GCP Certified Cloud Digital Leader 2021
● HashiCorp Certified Terraform Associate 2021
● AWS Certified Machine Learning – Specialty 2023
● AWS Certified Data Analytics – Specialty 2023
● AWS Certified Database – Specialty 2023
● AWS Certified Solutions Architect Associate 2020
● AWS Certified SysOps Administrator Associate 2020
● AWS Certified Developer Associate 2020
● AWS Certified Cloud Practitioner 2020
● Oracle Certified EE6 Web Services Developer 2013
● Oracle Certified Expert WebLogic Server Administrator 2008
● Sun Certified Web Component Developer for Java 2 2006
● Sun Certified Programmer for Java 2 Platform 1.4 2005

Experience:
Principal Gen AI Engineer May 2023 – Present
Fannie Mae, Remote
As a visionary, results-oriented Principal Gen AI Engineer, led a highly technical team in designing, developing, and
implementing responsible and accountable Gen AI applications using deep learning, natural language processing,
computer vision, model fine-tuning, and retrieval augmented generation (RAG) for various Fannie Mae financial and
business activities.
● Designed, developed, and deployed AI-powered chatbots (that increased customer satisfaction by 30%) using CCAI,
Dialogflow, Insights, Agent Assist, Agent Desktop, and Generative AI tools in NLP, NLU, Large-Language Models
(LLMs), Foundation Models (FMs), LangChain, AWS Bedrock, Databricks, LLaMA 2, AWS SageMaker, Jupyter
Notebook, and Python.
● Designed, planned, and deployed LLM-powered AI/ML services using Python, Agentic Workflow agents, LLMs,
LangChain, and AWS Bedrock under a RAG architecture.
● Designed and deployed the Fannie Mae Q&A (document segmentation) model with AI functionality throughout its
lifecycle, from pre-sales solution design with POC development to transitioning the POC to production-grade
deployment via a CI/CD pipeline; used cloud platforms and containerization tools such as Docker and Kubernetes.
● Harnessed the power of transformer architecture in NLP tasks to optimize model performance, efficiency, and data
processing, empowering language models to achieve high levels of accuracy, performance, reusability, and
adaptability.
● Selected and utilized Fannie Mae's data environment, including data collection, data ingestion, data verification,
and feature engineering using Snowflake to support AI applications.
● Designed, developed, and deployed end-to-end Generative AI products, applications, and solutions for Fannie Mae
business needs, using Vertex AI to integrate prompt-driven AI inputs in chatbot development.
● Monitored and compared various Gen AI models across multiple modalities (text, vision, speech, and audio),
addressing potential data- and concept-accuracy drift, security vulnerabilities, and MLOps support.
● Utilized Explainable AI (XAI) techniques and leveraged feedback, Reinforcement Learning from Human Feedback
(RLHF), cross-check, and performance metrics to ensure the trust and transparency of algorithms at the code level.
● Designed, developed, and implemented AI safety and guardrail APIs for data curation and data mitigation, fact-
checking, source verification, plagiarism detection and prevention, and content moderation.
● Designed and built fault-tolerant distributed training clusters in public cloud infrastructure to support long-running
large-scale ML training processes, and production deployment supporting Service Level Agreements (SLAs) reliably.

AI/ML Architect Feb 2021 – May 2023
USAA, San Antonio, TX
As an AI/ML Architect, collaborated with Machine Learning Engineers, Data Engineers, Data Analysts, and other
specialists to transform business models with AI, solve ML problems, and develop MLOps solutions; designed data
preparation and processing systems; developed, validated, and tested ML models; explored alternatives; and rapidly
prototyped, automated, and orchestrated ML pipelines.
● Translated business challenges into ML use cases, creating a chatbot using CCAI, Dialogflow, NLP, NLU, and LLM for
interaction between humans and computers.
● Developed a dynamic Q&A context using GPT-3.5 and dynamically converted natural language into accurate
transcriptions and translations using OpenAI Whisper.
● Developed a machine learning model for road-traffic counting from real-time camera video feeds, combining
computer vision, neural network architectures (CNNs, RNNs, Transformers), and an OpenAI vision model for object
detection, image segmentation, and scene understanding.
● Created a convolutional neural network (CNN)-based model to distinguish genuine signatures from forged ones on
documents using the PyTorch and OpenCV libraries.
● Built Generative AI applications with natural language processing and large language models (LLMs) using Python,
GAN, WGAN, Pinecone, and LangChain to generate email text, images, and artwork from textual prompts with RAG,
DALL-E, and CLIP, applying prompt-engineering techniques such as Few-Shot, Chain of Thought (CoT), and ReAct
(Reason + Act).
● Migrated the company's website business activities to Google Cloud, creating ML models using Vertex AI, PyTorch,
PySpark, Databricks, TensorFlow, BigQuery ML, Bigtable, Cloud SQL, and Cloud Storage; used MLflow for the ML
workflow, including experiment tracking, model versioning, and deployment.
● Developed an insurance approval model that accepts or rejects insurance applications from potential customers
using MLflow and DagsHub servers as per Google's recommended best practices and conforming to traceability,
reproducibility, and explainability.
● Designed, implemented, and operated a robust and efficient infrastructure that enabled the training, validation,
deployment, and monitoring of machine learning models.
● Implemented end-to-end solutions for batch and real-time algorithms along with requisite tooling around
monitoring, logging, automated testing, model retraining, model deployment, and metadata tracking.

AI/ML Research Engineer Nov 2019 – Mar 2021
Sallie Mae, Indianapolis, IN
As an AI/ML Research Engineer, led the strategy and resolution of highly complex and unique challenges requiring
in-depth evaluation across multiple areas of the enterprise, and delivered long-term, large-scale solutions with
creativity, innovation, and advanced analytical thinking.
● Developed AI/ML models and maintained ML solutions for enterprise business problems using AWS SageMaker and
Azure Machine Learning Studio built-in algorithms for an AI-powered Amazon Connect customer care center.
● Delivered technology-platform ML models and software components that solved challenging business needs in the
financial services industry, collaborating with the Product, Architecture, Engineering, and Data Science teams.
● Drove the initiative in platform creation and evolution of ML models and MLOps solutions, creating software on the
Azure Machine Learning Studio platform that enabled state-of-the-art intelligent systems.
● Maintained knowledge of industry best practices and emerging technologies, recommending innovations that
enhanced operations and provided a competitive advantage to the enterprise.
● Designed and implemented data engineering solutions by creating data repositories for machine learning,
identifying data sources, and determining storage mediums.
● Implemented data ingestion solutions (Snowflake), selecting job styles/types (batch/streaming) and data ingestion
pipelines (ML workloads, Kinesis, Kinesis Analytics, Kinesis Firehose, EMR, and Glue).
● Performed exploratory data analysis for cleaning, sanitizing, and preparing data for modeling, handling missing
data, corrupt data, stop words, etc.
● Implemented machine learning applications and operations for performance, availability, scalability, resiliency, fault
tolerance, and environment logging and monitoring (CloudTrail and CloudWatch).
● Applied basic AWS security practices using IAM, S3 bucket policies, Security Groups (SGs), VPCs for encryption and
creating and exposing endpoints and interacting with them.

AI/ML Solution Architect Apr 2017 – Dec 2019
Freddie Mac, McLean, VA
As an AI/ML Solution Architect, designed, developed, and deployed stable, dynamically scalable, highly available,
fault-tolerant AI/ML applications for home mortgage and banking web applications, and migrated them from
on-premises (physical and virtual) legacy systems to highly decoupled, API-based RESTful microservices in the AWS
cloud.
● Collaborated with AI/ML application architects, developers, data scientists, and system engineers to capture
functional and non-functional requirements.
● Participated in architecture and design for migration from on-premises legacy system to hybrid AWS cloud
environment, involving application workload and infrastructure re-design.
● Used infrastructure as code (IaC) for automation with CloudFormation, Terraform, Ansible, Chef, Puppet, Docker
containers, ECS, EKS, Fargate, OpenShift, and Kubernetes to provision and manage stacks of resources.
● Designed and developed microservices architecture; evaluated, recommended, and implemented a CI/CD pipeline
using CodeCommit, CodeDeploy, Hooks, CodePipeline, CodeStar, Jenkins, and Python; used JFrog Artifactory as the
artifact repository.
● Provisioned EC2 instances, Load Balancers, Auto Scaling Group (ASG), IAM, S3, Lambda functions.
● Configured networking with Route 53, Direct Connect, Internet Gateway, and Web Application Firewall (WAF),
supporting HTTP, HTTPS, Telnet, SFTP, SSH, firewall, and VPN traffic.
● Wrote CloudFormation scripts to automate AWS stack builds and used Service Catalog.
● Used open-source cloud technologies and AWS services including EC2, S3, EFS, RDS, Route 53, Step Functions, SQS,
SNS, CloudWatch, CloudTrail, and X-Ray.
● Participated in microservice architecture for SOAP/RESTful APIs between internal and external service layers using
industry standards and AWS best practices—and at the same time, managing AWS costs.

Infrastructure Architecture Lead Oct 2014 – Apr 2017
IBM, Atlanta, GA
Developed Hartford Insurance Group (HIG) business applications and migrated them to a virtual-machine-based
architecture for running WebLogic application servers.
● Delivered WebLogic infrastructure architecture and automation solutions for migrating 300+ WebLogic Servers from
version 10.3 to 12.1.2, from Unix 5.10 to Linux 8.6, and from physical to virtual machines, covering 50+ applications
in 40+ domains.
● Migrated from WebLogic 11g to WebLogic 12c and administered WebLogic in cloud environments.
● Planned and installed multiple WebLogic, WebSphere, and JBoss Application Server instances on brand-new or
upgraded existing v-tiers.
● Installed and configured Apache 2.2 HTTP Web Server; installed and configured Apache Tomcat 2.0 Server; installed
and configured SSL certificates in WebLogic.
● Used Shell scripts, WLST scripts, Python scripts, PowerShell scripts for deployment, configuration, and support of
J2EE applications for HIG Insurance Applications.
● Wrote deployment framework for configuring large WebLogic clustered and non-clustered domains; performed
ongoing system monitoring, performance tuning, disaster recovery procedures.
● Planned and implemented disaster recovery and high availability for critical systems.
● Used ServiceNow for Incident Management and to coordinate with Unix team, DB team, Patrol team (monitoring
alerts), Load Balance team and upper management.

Senior WebLogic Administrator Apr 2007 – Oct 2014
AT&T, Atlanta, GA
Planned, developed, and deployed Enterprise Document Processing applications (e.g., bills, emails) to test, QA, and
production environments.
● Delivered WebLogic infrastructure and architectural solutions, installing multiple WebLogic Server instances on
brand-new (VMware) or existing v-tiers residing on single or multiple hosts using WLST, directly or from a remote host.
● Used Shell scripts, Python scripts, WLST, and PowerShell scripts for deployment.
● Developed software in C, C++, Java, and J2EE, and provided configuration support for 40+ J2EE applications for
Enterprise Document Processing.
● Wrote upgrade plans and automated the deployment process (one-click upload/configure/undeploy/deploy) for a
very large-scale project spanning 40+ mission-critical applications across 400+ WebLogic instances (8.1/9.x to 10.3.x)
and 20+ Sun Web Servers.
● Followed ITIL best practices and performed ongoing system monitoring, performance tuning, and disaster recovery
procedures.
● Installed Single Sign-On (SSO) authentication; installed/upgraded Sun Web Server (iPlanet), Apache 2.2, Introscope,
HP diagnostics and JDK.
● Performed Unix system administration for Red Hat 2.6 and Sun OS 5.10.
● Used an Agile process for the software development life cycle (SDLC) with Subversion for source code control.
● Supported change-ticket management systems using ClearQuest and iTrack.
● Prepared technical documents including design document, application document for environment setups, builds,
configurations and deployment.
● Coordinated with Unix team, DB team, Patrol team (monitoring alerts), Load Balance team and upper management
team.
● Actively worked with Oracle support team on WebLogic issues and worked on applying patches and WebLogic
maintenance release (PSU) for WebLogic issues.

J2EE Tech Lead Feb 2006 – Nov 2006
ACS State Healthcare, Atlanta, GA
Developed standalone fraud detection software used by Federal and State healthcare systems.
● Managed technology team from definition phase through implementation.
● Designed, developed, and deployed web applications in J2EE with Jakarta Struts 1.1.
● Managed the project life cycle using Servlet 2.4, JSP 2.0, JavaScript, and core J2EE design patterns (Model-View-
Controller, Business Delegate, and Transfer Object), running on Tomcat 5.0.
● Built JSP pages using EL and Standard/Custom Tag Libraries.
● Used IntelliJ IDEA 5.1 and Eclipse 3.2 as IDEs.
● Performed integration, configuration, development, building, and deployment of EAR, WAR, and JAR files with Ant
on test, stage, and production systems running Tomcat 5.0.

Sr. Java/J2EE Developer Apr 2004 – Feb 2006
BellSouth, Atlanta, GA
Designed, developed, tested, and deployed business websites for the telecommunications industry.
● Gathered business rules and GUI requirements screen by screen; designed front-end application and page flows;
defined GUI screens, labels, and errors; created definitions for Form Beans and Action Mappings; and developed
Action, Manager, and ServiceDespatcher classes, then deployed and tested the application.
● Used Agile software development methodology based on iterative development and collaboration between self-
organizing cross-functional teams.
● Worked as an architect for the eBusiness solution and deployment architecture across the complete development
life cycle of the CCT-Web Enablement Project on the www.bellsouth.com portal.
● Ensured all interactions with CI services were synchronous (request-reply), while CI services communicated
asynchronously with legacy systems through a CORBA interface.
● Leveraged the Presentation Service Layer (PSL), Application Service Layer (ASL), and Customer Interface Service
Layer (CISL) architecture and heavily used the WebLogic Server 8.1 SP-2 to take advantage of WebLogic Workshop,
WebLogic Server, and WebLogic Integrator (WLI).
● Built the PSL layer with the Struts framework in WebLogic Workshop; Struts configuration XML files and application
deployment descriptors were graphically designed and automatically generated by WebLogic Workshop.
