Another attempt to fix the zoom issue
Showing 1 changed file with 156 additions and 0 deletions.
@@ -0,0 +1,156 @@
<!DOCTYPE html>
<html lang="en">
<head>
  <meta charset="UTF-8">
  <meta name="viewport" content="width=device-width, initial-scale=1">
  <title>Language through the Lens of AI: The Story of Embeddings</title>
  <style>
    /* Reset and base styles */
    * {
      margin: 0;
      padding: 0;
      box-sizing: border-box;
    }

    /* Base font size for relative units */
    html {
      font-size: 16px;
    }

    body {
      font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Arial, sans-serif;
      line-height: 1.6;
      color: #2c3e50;
      background-color: #f8f9fa;
      margin: 0;
      padding: 1rem;
    }

    /* Simple fluid container */
    .container {
      margin: 0 auto;
      padding: 0 1rem;
    }

    /* Section styles */
    .section {
      margin: 2rem 0;
      padding: 1rem;
      background: white;
      border-radius: 8px;
      box-shadow: 0 2px 4px rgba(0,0,0,0.1);
    }

    /* Typography */
    h1 {
      font-size: 2em;
      margin-bottom: 1rem;
      line-height: 1.2;
    }

    h3 {
      font-size: 1.5em;
      margin-bottom: 1rem;
      line-height: 1.2;
    }

    p {
      margin-bottom: 1rem;
    }

    /* Links */
    a {
      color: #3498db;
      text-decoration: none;
    }

    a:hover {
      color: #2980b9;
    }

    /* Title section */
    .title-section {
      text-align: center;
      margin-bottom: 2rem;
    }

    .contact-info {
      margin-top: 1rem;
      font-style: italic;
    }

    /* iframe container: padding-bottom of 56.25% reserves a 16:9 box (9 / 16 = 0.5625) */
    .iframe-container {
      position: relative;
      width: 100%;
      padding-bottom: 56.25%;
      margin: 1rem 0;
    }

    .iframe-container iframe {
      position: absolute;
      top: 0;
      left: 0;
      width: 100%;
      height: 100%;
      border: none;
      border-radius: 4px;
    }

    /* Simple media queries */
    @media screen and (max-width: 768px) {
      body {
        padding: 0.5rem;
      }

      .container {
        padding: 0 0.5rem;
      }

      .section {
        padding: 1rem 0.5rem;
      }

      h1 {
        font-size: 1.75em;
      }

      h3 {
        font-size: 1.25em;
      }
    }
  </style>
</head>
<body>
  <div class="container">
    <section class="section title-section">
      <h1>Language through the Lens of AI: The Story of Embeddings</h1>
      <div class="contact-info">
        Authored by <strong>Ken Kahn</strong><br>
        Contact: <a href="mailto:[email protected]">[email protected]</a>
      </div>
    </section>

    <section class="section">
      <p>In natural language processing, embeddings transform words and sentences into sequences of numbers, giving computers a representation of meaning they can compare and compute with.</p>
      <p>This technology powers voice assistants such as Siri and Alexa, as well as translation services like Google Translate.</p>
      <p>Generative AI systems such as ChatGPT, Bard, and DALL-E rely on these embeddings to understand and generate human-like text, create art, and answer complex queries. These advances highlight the pivotal role embeddings play in bridging human communication and machine intelligence.</p>
    </section>

    <section class="section">
      <h3>Hand-Crafted Embeddings</h3>
      <p>In the early days of language processing, before the advent of modern machine learning techniques, embeddings were meticulously crafted by hand. Linguists and computer scientists collaborated to build them, mapping each word into a numerical space based on its meaning and context. This involved analyzing the relationships between words and manually assigning values to capture those relationships.</p>
      <p>For example, words with similar meanings would be placed close together in this numerical space, while words with different meanings would sit further apart. The method, though innovative, had clear limitations: it was time-consuming and could not easily adapt to the nuances of language or an evolving vocabulary. Even so, these early efforts laid the groundwork for the more sophisticated, automated embedding techniques used in NLP today.</p>
      <div class="iframe-container">
        <!-- iframe content would go here -->
      </div>
    </section>

    <section class="section">
      <h3>Further Reading and References</h3>
      <p>To delve deeper into the world of NLP and embeddings, consider exploring additional resources and academic papers. These materials can offer a more in-depth understanding of the theories and practical applications of NLP, including the latest advancements and research findings. Academic journals, online courses, and specialized blogs in this field are great places to start for those interested in furthering their knowledge.</p>
      <p>To learn more about word embeddings, visit the <a href="https://en.wikipedia.org/wiki/Word_embedding" target="_blank">Wikipedia page on Word Embeddings</a>.</p>
    </section>
  </div>
</body>
</html>
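The "Hand-Crafted Embeddings" section in the page above describes mapping each word to numbers by hand so that similar meanings end up close together. As a minimal sketch of that idea, not anything taken from the page itself, the Python snippet below assigns a few invented feature values to a handful of words and compares them with cosine similarity; every word, feature dimension, and number here is made up purely for illustration.

# Minimal sketch: hand-crafted word vectors compared with cosine similarity.
# The feature dimensions (royalty, person, gender) and all values are invented
# for illustration; they are not taken from the page above.
import math

hand_crafted = {
    #          royalty, person, gender
    "king":   [1.0,     1.0,    1.0],
    "queen":  [1.0,     1.0,    0.0],
    "man":    [0.0,     1.0,    1.0],
    "woman":  [0.0,     1.0,    0.0],
    "castle": [0.8,     0.0,    0.0],
}

def cosine_similarity(a, b):
    # Words whose vectors point in similar directions count as similar in meaning.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

for w1, w2 in [("king", "queen"), ("king", "castle"), ("woman", "castle")]:
    score = cosine_similarity(hand_crafted[w1], hand_crafted[w2])
    print(f"{w1} ~ {w2}: {score:.2f}")
# "king ~ queen" scores highest (about 0.82), "king ~ castle" lower (about 0.58),
# and "woman ~ castle" lowest (0.00), mirroring the intuition that related words
# sit closer together in the hand-built space.

Cosine similarity compares only the direction of the vectors, not their length, which is why it is a common way to measure closeness in meaning for embeddings of any size.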