Hi, my name is Justin. 👋 I am a consultant with a passion for machine learning, analytics, research, and coding.
Here are my LinkedIn and GitHub profiles.
Education and Certifications:
TinyML - Edge device sensor data analytics:
Natural Language Processing with Attention Models:
- Credential (covers the Transformer architecture from Vaswani et al. and its attention mechanism.)
- Fun fact: the "GPT" in ChatGPT stands for Generative Pre-trained Transformer. The Transformer architecture's core concepts are attention, self-attention, and positional encoding.
- Attention lets the model consider all input words when deciding each output word in translation. Self-attention relates each word to the other words in the same sequence, capturing its meaning in context. Positional encoding injects information about where a word sits in the sequence.
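The three concepts above can be sketched in a few lines of NumPy. This is a minimal illustration, not the certification's code: it assumes the standard scaled dot-product attention formula and the sinusoidal positional encoding from the original Transformer paper, with made-up toy dimensions.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Attention(Q, K, V) = softmax(Q K^T / sqrt(d_k)) V."""
    d_k = Q.shape[-1]
    scores = Q @ K.T / np.sqrt(d_k)  # similarity of each query to each key
    # Numerically stable softmax over the key axis
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    return weights @ V  # each output is a weighted mix of the values

def positional_encoding(seq_len, d_model):
    """Sinusoidal encoding: even dimensions use sin, odd dimensions use cos."""
    pos = np.arange(seq_len)[:, None]
    i = np.arange(d_model)[None, :]
    angles = pos / np.power(10000.0, (2 * (i // 2)) / d_model)
    pe = np.zeros((seq_len, d_model))
    pe[:, 0::2] = np.sin(angles[:, 0::2])
    pe[:, 1::2] = np.cos(angles[:, 1::2])
    return pe

# Self-attention: queries, keys, and values all come from the same sequence.
X = np.random.rand(4, 8)           # toy input: 4 tokens, embedding size 8
X = X + positional_encoding(4, 8)  # inject position information
out = scaled_dot_product_attention(X, X, X)
print(out.shape)  # (4, 8): one contextualized vector per token
```

Passing the same matrix as queries, keys, and values is exactly what makes this *self*-attention; in encoder-decoder attention the queries would come from the decoder instead.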