Switched to 4o from Claude to fix this
ToonTalk committed Nov 24, 2024
1 parent b6026ea commit 8abb5d3
Showing 1 changed file with 207 additions and 0 deletions.
207 changes: 207 additions & 0 deletions apps/embeddings/responsive-html-v2-updated.html
@@ -0,0 +1,207 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0, maximum-scale=5.0, user-scalable=yes">
<title>Language through the Lens of AI: The Story of Embeddings</title>
<style>
/* Base styles */
:root {
--max-width: 1200px;
--body-padding: 1.25rem;
--primary-color: #2c3e50;
--background-color: #f8f9fa;
}

* {
margin: 0;
padding: 0;
box-sizing: border-box;
word-wrap: break-word;
overflow-wrap: break-word;
-webkit-text-size-adjust: 100%;
}

html {
width: 100%;
overflow-x: hidden;
}

body {
font-family: -apple-system, BlinkMacSystemFont, 'Segoe UI', Roboto, Oxygen, Ubuntu, Cantarell, sans-serif;
line-height: 1.6;
color: var(--primary-color);
background-color: var(--background-color);
padding: var(--body-padding);
width: 100%;
max-width: 100%;
text-size-adjust: 100%;
-webkit-text-size-adjust: 100%;
-moz-text-size-adjust: 100%;
}

/* Container styles */
.container {
max-width: var(--max-width);
margin: 0 auto;
padding: 0 1rem;
width: 100%;
}

/* Ensure long words (like emails or URLs) don't overflow */
p, h1, h2, h3, a {
max-width: 100%;
overflow-wrap: break-word;
word-wrap: break-word;
-ms-word-break: break-all;
word-break: break-word;
-ms-hyphens: auto;
-moz-hyphens: auto;
-webkit-hyphens: auto;
hyphens: auto;
}

/* Section styles */
.section {
margin: 2rem 0;
padding: 2rem;
background: white;
border-radius: 8px;
box-shadow: 0 2px 4px rgba(0,0,0,0.1);
}

/* Title section */
.title-section {
text-align: center;
margin-bottom: 3rem;
}

.title-section h1 {
font-size: 2.5rem;
margin-bottom: 1rem;
}

/* Typography */
h1, h2, h3 {
line-height: 1.2;
margin-bottom: 1rem;
}

p {
margin-bottom: 1.5rem;
}

/* Links */
a {
color: #3498db;
text-decoration: none;
transition: color 0.3s ease;
}

a:hover {
color: #2980b9;
}

/* iframe container */
.iframe-container {
position: relative;
width: 100%;
padding-bottom: 56.25%; /* 16:9 aspect ratio */
margin: 2rem 0;
}

.iframe-container iframe {
position: absolute;
top: 0;
left: 0;
width: 100%;
height: 100%;
border: none;
border-radius: 4px;
}

/* Contact info */
.contact-info {
font-style: italic;
margin-top: 1rem;
}

/* Responsive breakpoints */
@media (max-width: 768px) {
:root {
--body-padding: 0.75rem;
}

.section {
padding: 1.5rem;
margin: 1rem 0;
}

.title-section h1 {
font-size: 2rem;
}

h2 {
font-size: 1.5rem;
}

h3 {
font-size: 1.25rem;
}
}

@media (max-width: 480px) {
:root {
--body-padding: 0.5rem;
}

.section {
padding: 1rem;
margin: 0.75rem 0;
}

.title-section h1 {
font-size: 1.75rem;
}

.iframe-container {
padding-bottom: 75%; /* Adjusted for mobile */
}
}
</style>
</head>
<body>
<div class="container">
<section class="section title-section">
<h1>Language through the Lens of AI: The Story of Embeddings</h1>
<div class="contact-info">
Authored by <strong>Ken Kahn</strong><br>
Contact: <a href="mailto:[email protected]">[email protected]</a>
</div>
</section>

<section class="section">
<p>In natural language processing, embeddings map words and sentences to sequences of numbers, giving computers a way to compare and reason about meaning mathematically.</p>
<p>This technology powers tools like Siri and Alexa, and translation services like Google Translate.</p>
<p>Generative AI systems, such as ChatGPT, Bard, and DALL-E, leverage these embeddings to understand and generate human-like text, create art, or answer complex queries. These advancements showcase the pivotal role of embeddings in bridging human communication with machine intelligence.</p>
</section>
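The idea that "meaning becomes numbers" can be made concrete with a small sketch. The vectors below are invented toy values, not real embeddings, and the three-dimensional space is far smaller than the hundreds of dimensions production systems use; the point is only that similarity of meaning turns into a computable score:

```python
import math

# Toy 3-dimensional embeddings (hypothetical values, for illustration only).
embeddings = {
    "king":  [0.90, 0.80, 0.10],
    "queen": [0.85, 0.82, 0.15],
    "apple": [0.10, 0.20, 0.95],
}

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors: closer to 1.0 means more similar."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Related words score higher than unrelated ones.
print(cosine_similarity(embeddings["king"], embeddings["queen"]))  # near 1.0
print(cosine_similarity(embeddings["king"], embeddings["apple"]))  # much lower
```

Real systems compute similarity the same way, just over learned vectors with hundreds or thousands of dimensions.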

<section class="section">
<h3>Hand-Crafted Embeddings</h3>
<p>In the early days of language processing, before the advent of advanced machine learning techniques, embeddings were meticulously crafted by hand. Linguists and computer scientists collaborated to build them, mapping each word to a point in a numerical space based on its meaning and context. This process involved analyzing the relationships between words and manually assigning values to capture those relationships.</p>
<p>For example, words with similar meanings would be placed close together in this numerical space, while those with different meanings would be positioned further apart. This method, though innovative, had its limitations. It was time-consuming and could not easily adapt to the nuances of language and evolving vocabulary. However, these early endeavors laid the groundwork for the more sophisticated, automated embedding techniques that are used in NLP today.</p>
<div class="iframe-container">
<!-- iframe content would go here -->
</div>
</section>
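A minimal sketch of the hand-crafted approach, assuming invented feature dimensions (the features, words, and values here are hypothetical, chosen only to show how manually assigned coordinates place similar words near each other):

```python
import math

# Hand-assigned feature dimensions a (hypothetical) linguist might choose:
# [royalty, femininity, edibility] -- each value set by hand, not learned.
hand_crafted = {
    "king":   [1.0, 0.0, 0.0],
    "queen":  [1.0, 1.0, 0.0],
    "banana": [0.0, 0.5, 1.0],
}

def euclidean_distance(a, b):
    """Straight-line distance in the feature space: smaller means more similar."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

# "king" lands nearer to "queen" than to "banana" in this hand-built space.
print(euclidean_distance(hand_crafted["king"], hand_crafted["queen"]))
print(euclidean_distance(hand_crafted["king"], hand_crafted["banana"]))
```

The limitation described above is visible even in this toy: every new word, and every shift in usage, requires someone to choose its coordinates by hand, which is exactly what learned embeddings later automated.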

<!-- Additional sections follow the same pattern -->

<section class="section">
<h3>Further Reading and References</h3>
<p>To delve deeper into NLP and embeddings, explore additional resources and academic papers. These offer a more thorough grounding in both the theory and the practical applications of NLP, including the latest advancements and research findings. Academic journals, online courses, and specialized blogs in this field are good starting points.</p>
<p>To learn more about word embeddings, visit the <a href="https://en.wikipedia.org/wiki/Word_embedding" target="_blank">Wikipedia page on Word Embeddings</a>.</p>
</section>
</div>
</body>
</html>
