Show simple item record

dc.contributor.advisor: Chang Fernández, Leonardo
dc.contributor.author: Ángeles Cerón, Juan Carlos
dc.creator: Chang Fernández, Leonardo; 345979
dc.date.accessioned: 2022-09-11T02:44:02Z
dc.date.available: 2022-09-11T02:44:02Z
dc.date.created: 2021-06-19
dc.date.issued: 2021-04
dc.identifier.citation: Ángeles Cerón, J. (2021). Attention YOLACT++: achieving robust and real-time medical instrument segmentation in endoscopic procedures. (Master's thesis). Tecnológico de Monterrey, Monterrey, Nuevo León, Mexico. Retrieved from: https://hdl.handle.net/11285/648803
dc.identifier.uri: https://hdl.handle.net/11285/648803
dc.description.abstract: Image-based tracking of laparoscopic instruments via instance segmentation plays a fundamental role in computer- and robotic-assisted surgeries by aiding surgical navigation and increasing patient safety. Despite its crucial role in minimally invasive surgeries, accurate tracking of surgical instruments is a challenging task to achieve for two main reasons: 1) the complex surgical environment, and 2) the lack of model designs with both high accuracy and speed. Previous attempts in the field have prioritized robust performance over real-time speed, rendering them infeasible for live clinical applications. In this thesis, we propose the use of attention mechanisms to significantly improve the recognition capabilities of YOLACT++, a lightweight single-stage instance segmentation architecture, which we target at medical instrument segmentation. To further improve the performance of the model, we also investigate the use of custom data augmentation and anchor optimization via a differential evolution search algorithm. Furthermore, we investigate the effect of multi-scale feature aggregation strategies in the architecture. We perform ablation studies with Convolutional Block Attention and Criss-cross Attention modules at different stages in the network to determine an optimal configuration. Our proposed model, CBAM-Full + Aug + Anch, drastically outperforms the previous state of the art in robustness metrics commonly used in medical segmentation, achieving 0.435 MI_DSC and 0.471 MI_NSD while running at 69 fps, which is more than 12 points more robust in both metrics and 14 times faster than the previous best model. To our knowledge, this is the first work that explicitly focuses on both real-time performance and improved robustness.
dc.format.medium: Text
dc.language.iso: eng
dc.publisher: Instituto Tecnológico y de Estudios Superiores de Monterrey
dc.relation.isFormatOf: published version
dc.relation.isreferencedby: REPOSITORIO NACIONAL CONACYT
dc.rights: openAccess
dc.rights.uri: http://creativecommons.org/licenses/by-nc-nd/4.0
dc.subject.classification: INGENIERÍA Y TECNOLOGÍA::CIENCIAS TECNOLÓGICAS::TECNOLOGÍA MÉDICA::INSTRUMENTOS MÉDICOS
dc.subject.lcsh: Technology
dc.title: Attention YOLACT++: achieving robust and real-time medical instrument segmentation in endoscopic procedures.
dc.type: Tesis de Maestría / Master's thesis
dc.contributor.department: Escuela de Ingeniería y Ciencias
dc.contributor.committeemember: González Mendoza, Miguel
dc.contributor.committeemember: Alí, Sharib
dc.contributor.mentor: Ochoa Ruiz, Gilberto
dc.identifier.orcid: https://orcid.org/0000-0002-9483-4422
dc.subject.keyword: Endoscopy
dc.subject.keyword: Deep Learning
dc.subject.keyword: Attention
dc.subject.keyword: Instance segmentation
dc.subject.keyword: Medical
dc.contributor.institution: Campus Monterrey
dc.contributor.cataloger: emipsanchez
dc.description.degree: Maestro en Ciencias Computacionales
dc.audience.educationlevel: Investigadores/Researchers
dc.identificator: 7||33||3314||331110


