About
Articles by Leonardo
Contributions
Activity
-
The director of the Agencia Nacional de Ciberseguridad (ANCI), Daniel Álvarez, opens the first day of #PatagoniaCiber2025 by explaining why the…
Recommended by Leonardo Ferreira
-
Greetings from Puerto Varas! We are about to kick off #PatagoniaCiber2025, a major cybersecurity convention organized by the Agencia Nacional de…
Recommended by Leonardo Ferreira
-
This Thursday (22/05) I took part in the first CTI Workshop, held by CISSA, the Cybersecurity Competence Center of CESAR. I was able to…
Recommended by Leonardo Ferreira
Experience and education
Licenses and certifications
Volunteer experience
-
Advisory Board Program of the Gartner Security & Risk Management Conference - Brazil 2025
Gartner
- Present · 8 months
Science and technology
The Advisory Board aims to gather input from professionals on the content and format of this program, so that we continue to produce Conferences that are relevant, diverse, and inclusive for our attendees.
As a member of the Advisory Board, you will play a key role in the planning process, contributing on topics related to the Gartner Conferences that will impact different sectors and industries.
Your view of the trends, drivers, and priorities shaping these industries is very valuable in helping us explore how the Conference will address these themes in an impactful and relevant way.
Publications
-
MongoDB: analysis of performance with data from the National High School Exam (Enem).
16th Iberian Conference on Information Systems and Technologies (CISTI)
It is increasingly common for public and private organizations to deal with the challenges of data management in Big Data environments, typically characterized by high volume, great variety, and fast generation velocity, in addition to intrinsic issues of data veracity and value. This context gave rise to NoSQL database technologies, with MongoDB as one of their prominent representatives. This paper evaluates the performance of MongoDB in its cluster architecture against the standalone mode, with massive data from the National High School Exam (Enem), in addition to offering a brief interpretation of the results from the perspective of the educational domain. The initial results of query, update, and delete operations demonstrate a set of advantages and disadvantages of the experiments' architectures. The Information Technology (IT) professional's role in defining the scope of the database project, enabling a bolder choice of NoSQL architecture, is highlighted from a cost-benefit perspective for organizations.
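Purely as an illustration of the kind of timing measurement such an evaluation involves, a minimal pymongo sketch is shown below; the connection string, database/collection names, and query field are hypothetical, and the cluster run would simply point the URI at the mongos router of the sharded deployment.

    import time
    from pymongo import MongoClient

    # Hypothetical standalone instance; for a cluster run, the URI would
    # point at the mongos router of the sharded deployment instead.
    client = MongoClient("mongodb://localhost:27017")
    coll = client["enem"]["results"]  # hypothetical database/collection names

    start = time.perf_counter()
    # Example read workload: count candidates matching an illustrative filter.
    matched = coll.count_documents({"uf": "DF"})
    elapsed = time.perf_counter() - start
    print(f"matched {matched} documents in {elapsed:.3f}s")

Repeating the same workload against both deployments and comparing the elapsed times is the essence of the comparison described in the abstract.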
-
Data Governance in the Big Data context: a bibliometric study (USP).
16th International Conference on Information Systems and Technology Management (CONTECSI)
This is an exploratory bibliometric study that sought to identify the main areas of knowledge addressing the subjects studied, the behavior of the conceptual structure of this field of research, the main theoretical approaches, the main authors, the main research fronts, and the key challenges in implementing a Data Governance policy in the Big Data environment. Data were collected from the Web of Science database, which returned 50 articles, and the Scopus database, which returned 109 articles, covering the period from 2012 to 2019. The data were extracted and exported to the VOSviewer software for co-word, co-citation, and bibliographic-coupling bibliometric analyses. The results highlight the dominance of the United States and China in terms of the number of publications, with computer science as the main area of knowledge, especially the field of information systems. Machine learning, artificial intelligence, and Internet of Things technologies are addressed in the main studies. Finally, a visual model is presented that integrates the main results across Data Governance, Big Data, and Information Systems.
Courses
-
Deep Learning Specialization / deeplearning.ai / Andrew Ng / 2020
RUEFM3D757L9
-
AI For Everyone / deeplearning.ai / Andrew Ng / 2020
6K9CR6WEVEZ8
-
Computational Social Science Methods / University of California, Davis / Martin Hilbert / 2020
PMRGPWM3CJC4
-
Convolutional Neural Networks / deeplearning.ai / Andrew Ng / 2020
5AK85LYY7T2D
-
Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization / deeplearning.ai / Andrew Ng / 2020
NWNPLJAXTXKU
-
Machine Learning for Business Professionals / Google Cloud / 2020
4TL5Q2N2JHTY
-
Neural Networks and Deep Learning / deeplearning.ai / Andrew Ng / 2020
NSXMA7NXUTCD
-
Sequence Models / deeplearning.ai / Andrew Ng / 2020
3LMS9NZAKT9N
-
Structuring Machine Learning Projects / deeplearning.ai / Andrew Ng / 2020
LYG86QWV4YAH
Projects
-
Computational Social Science Methods: Web Scraping Lab.
- Present
The final project of the Computational Methods in Social Sciences course, with the following objective: web scrape two different YouTube channels using http://webscraper.io. Deliverables: at least 5 screenshots of different stages of the web scraping progress (taken on Mac or Windows, in image formats such as .JPG, .JPEG, .PNG, .TIFF, .GIF, .BMP, .PDF, etc.), the spreadsheet file for the first YouTube channel, and the spreadsheet file for the second YouTube channel (spreadsheet extensions such as .csv, the most generic, .xls, .xlsx, .numbers, etc.).
-
Deep Learning: Sequence Models. Building your Recurrent Neural Network - Step by Step.
- Present
Concluding project for week 1 of the Sequence Models course of the Deep Learning specialization. Uses a Jupyter notebook with numpy, rnn, and other packages, with the objective of implementing key components of a Recurrent Neural Network in numpy. Recurrent Neural Networks (RNNs) are very effective for Natural Language Processing and other sequence tasks because they have "memory".
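As an illustrative sketch of the single forward step such an assignment centers on (not the course's own code), assuming a tanh hidden activation and a softmax output, with illustrative names and shapes:

    import numpy as np

    def rnn_cell_forward(xt, a_prev, Wax, Waa, Wya, ba, by):
        # One RNN time step: new hidden state from current input and previous state.
        a_next = np.tanh(Waa @ a_prev + Wax @ xt + ba)
        # Prediction for this time step: softmax over the output logits.
        zt = Wya @ a_next + by
        yt = np.exp(zt) / np.sum(np.exp(zt), axis=0, keepdims=True)
        return a_next, yt

The full network simply applies this cell once per time step, carrying a_next forward as the "memory".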
-
Deep Learning: Sequence Models. Character level language model - Dinosaurus Island.
- Present
Concluding project for week 1 of the Sequence Models course of the Deep Learning specialization. Uses a Jupyter notebook with numpy, rnn, utils, and other packages, with the objectives: how to store text data for processing using an RNN, how to synthesize data by sampling predictions at each time step and passing them to the next RNN-cell unit, how to build a character-level text generation recurrent neural network, and why clipping the gradients is important.
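For the gradient-clipping point, a small illustrative sketch in numpy (the dictionary layout of the gradients is an assumption):

    import numpy as np

    def clip_gradients(gradients, max_value):
        # Clip every gradient array in place to [-max_value, max_value]
        # so the character-level RNN does not suffer from exploding gradients.
        for grad in gradients.values():
            np.clip(grad, -max_value, max_value, out=grad)
        return gradients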
-
Deep Learning: Sequence Models. Emojify.
- Present
Concluding project for week 2 of the Sequence Models course of the Deep Learning specialization. Uses a Jupyter notebook with the Numpy, TensorFlow, and emoji packages, among others, with the objective: start with a baseline model (Emojifier-V1) using word embeddings, then build a more sophisticated model (Emojifier-V2) that further incorporates an LSTM.
-
Deep Learning: Sequence Models. Jazz improvisation with LSTM.
- Present
Concluding project for week 1 of the Sequence Models course of the Deep Learning specialization. Uses a Jupyter notebook with the numpy, music21, grammar, rnn, music_utils, and keras packages, among others, with the objective: apply an LSTM to music generation and generate your own jazz music with deep learning.
-
Deep Learning: Sequence Models. Neural Machine Translation with Attention.
- Present
Concluding project for week 3 of the Sequence Models course of the Deep Learning specialization. Uses a Jupyter notebook with the Numpy, Keras, faker, and babel.dates packages, among others, with the objective: build a Neural Machine Translation (NMT) model to translate human-readable dates ("25th of June, 2009") into machine-readable dates ("2009-06-25"), using an attention model, one of the most sophisticated sequence-to-sequence models.
-
Deep Learning: Sequence Models. Operations on word vectors - Debiasing.
- Present
Concluding project for week 2 of the Sequence Models course of the Deep Learning specialization. Uses a Jupyter notebook with the Numpy and TensorFlow packages, among others, with the objective: load pre-trained word vectors and measure similarity using cosine similarity, use word embeddings to solve word analogy problems such as "Man is to Woman as King is to __", and modify word embeddings to reduce their gender bias.
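A small sketch of the cosine-similarity measure referenced here, in plain numpy:

    import numpy as np

    def cosine_similarity(u, v):
        # Cosine of the angle between two word vectors: close to 1.0 for
        # similar directions, 0.0 for orthogonal, -1.0 for opposite.
        return np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))

The analogy task then amounts to finding the word vector w that maximizes cosine_similarity(w, king - man + woman).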
-
Deep Learning: Sequence Models. Trigger word detection (Speech Recognition project).
- Present
Concluding project for week 3 of the Sequence Models course of the Deep Learning specialization. Uses a Jupyter notebook with the Numpy, Pydub, io, and glob packages, among others, with the objective: structure a speech recognition project, synthesize and process audio recordings to create train/dev datasets, and train a trigger word detection model and make predictions.
-
Deep Learning: Building your Deep Neural Network: Step by Step.
- Present
Concluding project for week 4 of the Neural Networks and Deep Learning course of the Deep Learning specialization. Uses a Jupyter notebook with the numpy, matplotlib, and h5py packages, among others, with the objective: develop an intuition of the overall structure of a neural network, write functions (e.g. forward propagation, backward propagation, logistic loss, etc.) that help decompose the code and ease the process of building a neural network, and initialize/update parameters according to the desired structure.
-
Deep Learning: Convolutional Neural Networks.
- Present
Concluding project for week 2 of the Convolutional Neural Networks course of the Deep Learning specialization. Uses a Jupyter notebook with the numpy, matplotlib, sklearn, and init_utils (sigmoid, relu, compute_loss, forward_propagation, backward_propagation, update_parameters, predict, load_dataset, plot_decision_boundary, predict_dec) packages, among others, with the objective: learn to use Keras, a high-level neural networks API (programming framework) written in Python and capable of running on top of several lower-level frameworks including TensorFlow and CNTK, to build Emotion Detection in Images of Faces.
-
Deep Learning: Convolutional Neural Networks.
- Present
Concluding project for week 1 of the Convolutional Neural Networks course of the Deep Learning specialization. Uses a Jupyter notebook with the numpy, matplotlib, sklearn, and init_utils (sigmoid, relu, compute_loss, forward_propagation, backward_propagation, update_parameters, predict, load_dataset, plot_decision_boundary, predict_dec) packages, among others, with the objective: implement convolutional (CONV) and pooling (POOL) layers in numpy, including both forward propagation and backward propagation.
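As an illustrative sketch of the forward-propagation piece (not the assignment's exact API), one convolution step over a single window and one max-pooling step, in numpy:

    import numpy as np

    def conv_single_step(a_slice, W, b):
        # Apply one filter to one window: element-wise product, sum over all
        # entries, then add the scalar bias b for this filter.
        return np.sum(a_slice * W) + b

    def max_pool_window(a_slice):
        # One pooling step: keep the largest activation in the window.
        return np.max(a_slice)

The full CONV/POOL layers slide these operations over every window of the input volume.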
-
Deep Learning: Convolutional Neural Networks.
- Present
Concluding project for week 1 of the Convolutional Neural Networks course of the Deep Learning specialization. Uses a Jupyter notebook with the numpy, matplotlib, sklearn, and init_utils (sigmoid, relu, compute_loss, forward_propagation, backward_propagation, update_parameters, predict, load_dataset, plot_decision_boundary, predict_dec) packages, among others, with the objective: implement helper functions for building a TensorFlow model and implement a fully functioning ConvNet using TensorFlow.
-
Deep Learning: Convolutional Neural Networks.
- Present
Concluding project for week 2 of the Convolutional Neural Networks course of the Deep Learning specialization. Uses a Jupyter notebook with the keras, numpy, matplotlib, sklearn, and init_utils (sigmoid, relu, compute_loss, forward_propagation, backward_propagation, update_parameters, predict, load_dataset, plot_decision_boundary, predict_dec) packages, among others, with the objective: implement the basic building blocks of ResNets, and put these building blocks together to implement and train a state-of-the-art neural network for image classification.
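A compressed, illustrative sketch of the kind of ResNet identity block such an assignment assembles, written with tf.keras layers (the filter count and kernel size are illustrative, and the input is assumed to already have `filters` channels so the skip-connection shapes match):

    from tensorflow.keras import layers

    def identity_block(x, filters, kernel_size=3):
        # Main path: two conv + batch-norm stages; shortcut path: the input itself.
        shortcut = x
        x = layers.Conv2D(filters, kernel_size, padding="same")(x)
        x = layers.BatchNormalization()(x)
        x = layers.Activation("relu")(x)
        x = layers.Conv2D(filters, kernel_size, padding="same")(x)
        x = layers.BatchNormalization()(x)
        # Add the shortcut back before the final non-linearity (the skip connection).
        x = layers.Add()([x, shortcut])
        return layers.Activation("relu")(x)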
-
Deep Learning: Convolutional Neural Networks.
- Present
Concluding project for week 3 of the Convolutional Neural Networks course of the Deep Learning specialization. Uses a Jupyter notebook with the keras, tensorflow, numpy, matplotlib, argparse, pandas, and PIL packages, among others, with the objective: learn about object detection using the very powerful YOLO model, use object detection on a car detection dataset, and deal with bounding boxes.
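One helper such a YOLO exercise typically needs is intersection over union between two boxes; a sketch assuming boxes are given as (x1, y1, x2, y2) corner coordinates:

    def iou(box1, box2):
        # Overlap area between the two boxes.
        xi1, yi1 = max(box1[0], box2[0]), max(box1[1], box2[1])
        xi2, yi2 = min(box1[2], box2[2]), min(box1[3], box2[3])
        inter = max(xi2 - xi1, 0) * max(yi2 - yi1, 0)
        # Union area = sum of the individual areas minus the overlap.
        area1 = (box1[2] - box1[0]) * (box1[3] - box1[1])
        area2 = (box2[2] - box2[0]) * (box2[3] - box2[1])
        return inter / (area1 + area2 - inter)

IoU is what non-max suppression uses to discard overlapping candidate boxes.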
-
Deep Learning: Convolutional Neural Networks.
- Present
Concluding project for week 4 of the Convolutional Neural Networks course of the Deep Learning specialization. Uses a Jupyter notebook with the tensorflow, numpy, matplotlib, and PIL packages, among others, with the objective: implement the neural style transfer algorithm and use it to generate novel artistic images, building an Art Generation with Neural Style Transfer project.
-
Deep Learning: Convolutional Neural Networks.
- Present
Concluding project for week 4 of the Convolutional Neural Networks course of the Deep Learning specialization. Uses a Jupyter notebook with the keras, tensorflow, numpy, and matplotlib packages, among others, with the objective: implement the triplet loss function, use a pretrained model to map face images into 128-dimensional encodings, and use these encodings to perform face verification and face recognition, building a Face Recognition system.
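An illustrative sketch of the triplet loss mentioned here, written in numpy over the 128-dimensional encodings (the margin value is illustrative):

    import numpy as np

    def triplet_loss(anchor, positive, negative, margin=0.2):
        # Pull the anchor toward the positive encoding and push it away from
        # the negative one by at least `margin`.
        pos_dist = np.sum((anchor - positive) ** 2, axis=-1)
        neg_dist = np.sum((anchor - negative) ** 2, axis=-1)
        return np.sum(np.maximum(pos_dist - neg_dist + margin, 0.0))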
-
Deep Learning: Deep Neural Network - Application.
- Present
Concluding project for week 4 of the Neural Networks and Deep Learning course of the Deep Learning specialization. Uses a Jupyter notebook with the numpy, matplotlib, h5py, time, PIL, and scipy packages, among others, with the objective: learn how to use all the helper functions built in the previous assignment to build a model of any desired structure, experiment with different model architectures and see how each one behaves, and recognize that it is always easier to build your helper functions before attempting to build a neural network from scratch.
-
Deep Learning: Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization.
- Present
Concluding project for week 1 of the Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization course of the Deep Learning specialization. Uses a Jupyter notebook with the numpy, matplotlib, sklearn, and init_utils (sigmoid, relu, compute_loss, forward_propagation, backward_propagation, update_parameters, predict, load_dataset, plot_decision_boundary, predict_dec) packages, among others, with the objective: understand different initialization methods and their impact on model performance, implement zero initialization and see how it fails to "break symmetry", recognize that random initialization "breaks symmetry" and yields more efficient models, and understand that you can use both random initialization and scaling to get even better training performance.
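An illustrative sketch of the contrast that assignment draws between zero initialization and scaled random ("He") initialization (the layer sizes and dictionary layout are assumptions):

    import numpy as np

    def initialize(layer_dims, mode="he"):
        # Zero initialization fails to break symmetry: every unit in a layer
        # computes the same function. Scaled random initialization avoids this.
        params = {}
        for l in range(1, len(layer_dims)):
            if mode == "zeros":
                W = np.zeros((layer_dims[l], layer_dims[l - 1]))
            else:
                W = np.random.randn(layer_dims[l], layer_dims[l - 1]) * np.sqrt(2.0 / layer_dims[l - 1])
            params[f"W{l}"] = W
            params[f"b{l}"] = np.zeros((layer_dims[l], 1))
        return params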
-
Deep Learning: Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization.
- Present
Concluding project for week 1 of the Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization course of the Deep Learning specialization. Uses a Jupyter notebook with the numpy, matplotlib, sklearn, and init_utils (sigmoid, relu, compute_loss, forward_propagation, backward_propagation, update_parameters, predict, load_dataset, plot_decision_boundary, predict_dec) packages, among others, with the objective: understand different regularization methods that can help your model, implement dropout and see it work on data, recognize that a model without regularization gives better accuracy on the training set but not necessarily on the test set, and understand that you can use both dropout and regularization on your model.
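A sketch of the (inverted) dropout step such an assignment implements, in numpy:

    import numpy as np

    def dropout_forward(A, keep_prob=0.8):
        # Randomly silence a fraction (1 - keep_prob) of the activations and
        # rescale the survivors so the expected value of A is unchanged.
        mask = (np.random.rand(*A.shape) < keep_prob).astype(float)
        return A * mask / keep_prob, mask

The same mask is reused in the backward pass so the gradients match the forward computation.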
-
Deep Learning: Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization.
- Present
Concluding project for week 1 of the Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization course of the Deep Learning specialization. Uses a Jupyter notebook with the numpy, matplotlib, sklearn, and init_utils (sigmoid, relu, compute_loss, forward_propagation, backward_propagation, update_parameters, predict, load_dataset, plot_decision_boundary, predict_dec) packages, among others, with the objective: implement gradient checking from scratch, understand how to use the difference formula to check the backpropagation implementation, recognize that the backpropagation algorithm should give results similar to those obtained with the difference formula, and learn how to identify which parameter's gradient was computed incorrectly.
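The difference formula referred to here compares the analytic gradient with a two-sided numerical estimate; a sketch for a single scalar parameter (function and variable names are illustrative):

    def gradient_check(J, dJ_analytic, theta, eps=1e-7):
        # Two-sided numerical derivative of the cost J at theta.
        grad_approx = (J(theta + eps) - J(theta - eps)) / (2 * eps)
        # Relative difference; values around 1e-7 or smaller suggest the
        # backpropagation gradient is correct.
        diff = abs(dJ_analytic - grad_approx) / max(abs(dJ_analytic) + abs(grad_approx), 1e-12)
        return grad_approx, diff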
-
Deep Learning: Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization.
- Present
Concluding project for week 2 of the Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization course of the Deep Learning specialization. Uses a Jupyter notebook with the numpy, matplotlib, sklearn, and init_utils (sigmoid, relu, compute_loss, forward_propagation, backward_propagation, update_parameters, predict, load_dataset, plot_decision_boundary, predict_dec) packages, among others, with the objective: understand the intuition behind Adam and RMSprop, recognize the importance of mini-batch gradient descent, and learn the effects of momentum on the overall performance of the model.
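A sketch of the momentum update compared in that assignment (Adam layers an RMS-style second moment on top of this idea); the dictionary layout of the parameters is an assumption:

    def update_with_momentum(params, grads, velocity, beta=0.9, learning_rate=0.01):
        # An exponentially weighted average of past gradients smooths the descent direction.
        for key in params:
            velocity[key] = beta * velocity[key] + (1 - beta) * grads[key]
            params[key] = params[key] - learning_rate * velocity[key]
        return params, velocity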
-
Deep Learning: Logistic Regression with a Neural Network mindset.
- Present
Concluding project for week 2 of the Neural Networks and Deep Learning course of the Deep Learning specialization. Uses a Jupyter notebook with the numpy, matplotlib, and h5py packages, among others, with the objective: build the general architecture of a learning algorithm, including initializing parameters, calculating the cost function and its gradient, and using an optimization algorithm (gradient descent).
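A condensed, illustrative sketch of the pieces that assignment wires together (the parameter shapes are assumptions: w is (n, 1), X is (n, m), Y is (1, m)):

    import numpy as np

    def sigmoid(z):
        return 1.0 / (1.0 + np.exp(-z))

    def propagate(w, b, X, Y):
        # Forward pass: predictions and cross-entropy cost; backward pass: gradients.
        m = X.shape[1]
        A = sigmoid(w.T @ X + b)
        cost = -np.mean(Y * np.log(A) + (1 - Y) * np.log(1 - A))
        dw = (X @ (A - Y).T) / m
        db = np.sum(A - Y) / m
        return dw, db, cost

    def gradient_descent_step(w, b, dw, db, learning_rate=0.005):
        # One update of the parameters in the direction that lowers the cost.
        return w - learning_rate * dw, b - learning_rate * db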
-
Deep Learning: Logistic Regression with a Neural Network mindset.
- Present
Concluding project for week 3 of the Improving Deep Neural Networks: Hyperparameter tuning, Regularization and Optimization course of the Deep Learning specialization. Uses a Jupyter notebook with the tensorflow, numpy, matplotlib, sklearn, and init_utils (sigmoid, relu, compute_loss, forward_propagation, backward_propagation, update_parameters, predict, load_dataset, plot_decision_boundary, predict_dec) packages, among others, with the objective: after completing this assignment you will be able to implement your own deep learning models using TensorFlow; in fact, using the SIGNS dataset, you build a deep neural network model to recognize numbers from 0 to 5 in sign language with impressive accuracy.
-
Deep Learning: Planar data classification with a hidden layer.
- Present
Concluding project for week 3 of the Neural Networks and Deep Learning course of the Deep Learning specialization. Uses a Jupyter notebook with the sklearn, numpy, and matplotlib packages, among others, with the objective: develop an intuition of back-propagation and see it work on data, recognize that the more hidden layers you have, the more complex structures you can capture, and build all the helper functions needed to implement a full model with one hidden layer.
-
Deep Learning: Structuring Machine Learning Projects.
- Present
1. Machine Learning flight simulator.
2. Bird recognition in the city of Peacetopia (case study).
3. Autonomous driving (case study).
-
Data Science: Creation of the Data Science Unit (Núcleo de Ciência de Dados) at the CNMP
- Present
The Núcleo de Ciência de Dados, part of the Secretaria de Gestão Estratégica, was created through Portaria CNMP-PRESI nº 63. The unit was created to advise the Council's units in activities involving the collection, organization, and analysis of data; to prepare studies that guide discussions and support the CNMP's strategic decision-making process; to develop research and diagnostics on the Ministério Público across its various thematic areas; and to disseminate a data culture within the CNMP.
-
Data Science: Analyzing Airbnb Data for the City of Rio de Janeiro.
-
Exploratory Data Analysis conducted on data from Airbnb's operation in the city of Rio de Janeiro. The project yielded some interesting insights.
-
Data Science: Laboratório de Inteligência do Gasto Público (LIGP) - Secretaria de Economia do DF
-
Within the scope of the government procurement cycle, the Laboratório de Inteligência do Gasto Público (LIGP) has the following objectives: I - Contribute to improving the quality of public spending; II - Prepare studies on the quality of public spending; III - Propose measures to rationalize public spending; and IV - Create and monitor performance indicators linked to the proposed rationalization measures. The LIGP will be coordinated by the Subsecretaria de Compras Governamentais (SCG/SAGA), together with the other subsecretariats of the SEFP, which are responsible for: I - Collecting, organizing, and analyzing data related to the government procurement cycle; II - Preparing reports and management dashboards related to the procurement cycle; III - Making available studies on the quality of spending in specific areas of the procurement cycle; IV - Proposing integrations and/or modifications of transactional systems within the procurement cycle; V - Presenting a portfolio of public-spending rationalization measures within the procurement cycle; VI - Creating and monitoring performance indicators linked to the rationalization measures proposed within the procurement cycle; and VII - Publicizing the work carried out by the LIGP on the Portal de Compras do Distrito Federal.
Status: In progress; Nature: Development.
-
Data Science: Programa COMPRASDF
-
The publication of Decreto nº 37.729, of October 26, 2016, by the Subsecretaria de Compras Governamentais (SCG) of the Secretaria de Fazenda, Planejamento, Orçamento e Gestão (SEFP) ushered in a new era in public procurement management in the Government of the Distrito Federal (GDF). It instituted the Distrito Federal Government Procurement Management Program (COMPRASDF), applicable to purchases and service contracting at the district level. The program brings together actions aimed at modernizing the public procurement cycle and improving the management of the integrated supply chain of the agencies and entities of the Distrito Federal's Executive Branch. To meet this modernization challenge, the Distrito Federal's Portal de Compras Governamentais (PCG) was created, developed as a web platform that allows online management of bidding procedures, contract management, and supply management for the Distrito Federal's agencies and entities. The portal comprises the following corporate systems: Sistema de Gestão de Compras Governamentais (e-ComprasDF), Sistema de Gestão de Contratos (e-ContratosDF), and Sistema de Gestão de Suprimentos (e-SupriDF). It is oriented toward the future development of data science, using data analysis, data mining, machine learning, artificial intelligence, evolutionary algorithms, and Internet of Things (IoT) tools in support of decision-making and the design of the Distrito Federal's government procurement policies.
Status: Completed; Nature: Development.
Honors and awards
-
6th National Treasury Mathematics Olympiad (Olimpíada de Matemática do Tesouro Nacional) - Bronze Medalist
Secretaria do Tesouro Nacional - Ministério da Economia
-
Academic Merit - 1st place in the UFPR undergraduate class of 2002 - IRA 0.8708
Universidade Federal do Paraná - UFPR
Cumulative Performance Index (Índice de Rendimento Acumulado, IRA): 0.8708
Languages
-
Portuguese
Native or bilingual proficiency
-
English
Limited working proficiency
-
Spanish
Limited working proficiency
Recommendations received
9 people have recommended Leonardo
More activity by Leonardo
-
Yesterday we had the pleasure to host a meeting on the topic of cyber inequity. According to the Global Cybersecurity Outlook 2025, 35% of small…
Recommended by Leonardo Ferreira