{
"name": "Add REST endpoints to a Python application",
"description": "This tutorial includes a Python notebook showing how to add a web service to a streaming application using the Streams jobs service",
"language": ["Python"],
"tags": ["cloud pak for data","rest"],
"external": true,
"category": ["2"],
"featured": true,
"url": "https://developer.ibm.com/tutorials/access-streaming-data-with-rest-services/"
},
{
"name": "Score streaming data with R and Streams flows",
"description": "This shows how to use an R model to score data from a Streams flows application and how to download a newer version of the model from Cloud Object Storage.",
"language": ["SPL"],
"category": ["2","7","8"],
"blogPost": "https://medium.com/ibm-watson/real-time-forecasting-using-r-and-watson-studio-513c45abd1a9",
"url": "https://github.com/IBMStreams/sample.forecast_with_r",
"tags": ["R","r model","streams flows","publish","microservice","model","ai"],
"featured": true,
"operators": ["Publish","RScript","R","ObjectStorageScan", "ObjectStorageSource"]
},
{
"name": "Store high volumes of streaming data from IBM Event Streams in Cloud Object Storage",
"description": "Read incoming JSON data from IBM Event Streams and write it to IBM Cloud Object Storage (COS). Also demonstrates exactly-once processing and Kafka consumer groups.",
"language": ["SPL"],
"category": ["2"],
"url": "https://github.com/IBMStreams/streamsx.objectstorage/tree/develop/demo/data.historian.event.streams.cos.exactly.once.semantics.demo",
"tags": ["event streams","message hub","json","messagehub","cos"],
"toolkits": ["objectstorage","json","messagehub"],
"external":true
},
{
"name": "Use Server-Sent Events (SSE) in a Python application",
"description": "Read Server-Sent Events into IBM Streams with a Python topology. Uses the SSEClient to ingest data from the Wikipedia recent changes stream.",
"language": ["Python"],
"category": ["2"],
"url": "https://gist.github.com/ddebrunner/21db521909accd2ec364861964e18ae3",
"tags": ["beginner", "sse", "python"],
"toolkits": ["topology"],
"featured": true,
"external": true,
"zip": "https://gist.github.com/ddebrunner/21db521909accd2ec364861964e18ae3/archive/3b0dfda3b9d42533f16801aa0d7e269250eef440.zip"
},
{
"name": "Get started with JMS operators",
"description": "Learn how to configure the JMS operators for use with Apache ActiveMQ and WebSphere MQ. Includes sample connection documents.",
"language": ["SPL"],
"category": ["2","1"],
"external": true,
"blogPost": "https://developer.ibm.com/streamsdev/2016/04/18/getting-started-with-jms-operators/",
"url": "http://ibmstreams.github.io/streamsx.documentation/docs/4.2/messaging/jms-operators-getting-started/",
"tags": ["messaging","mq","jmssource","jmssink"]
},
{
"name": "Create WebSphere MQ binding file and queue",
"description": "This article provides sample steps for creating the WebSphere MQ binding file used with the JMS operators.",
"language": ["SPL"],
"category": ["2","1"],
"external": true,
"blogPost": "https://developer.ibm.com/streamsdev/2016/04/18/getting-started-with-jms-operators/",
"url": "http://ibmstreams.github.io/streamsx.documentation/docs/4.2/messaging/mq-create-objects-bindings-sample/",
"tags": ["messaging","mq object","jmssource","jmssink"]
},
{
"name": "Use Event Streams operator with Streams flows",
"description": "Notebook demonstrating how to configure the Event Streams operator in Watson Studio Streams flows. Builds on the Data Historian example and sends sample data to IBM Event Streams.",
"language": ["Python"],
"category": ["2","1"],
"external": true,
"blogPost": "https://developer.ibm.com/streamsdev/videos/demo-streaming-analytics-using-python-ibm-data-science-experience/",
"url": "https://dataplatform.ibm.com/exchange/public/entry/view/a87f10c5c5cd65495a2f9d880af72d7a",
"tags": ["messagehub","event streams"]
},
{
"name": "Detect invalid data in IoT devices in real time using Streams and Python",
"description": "Ingest data from IoT devices and analyze it to detect potential failures. The Streams application connects to the IoT devices through the Watson IoT Platform. Includes instructions to simulate IoT device data.",
"language": ["Python"],
"category": ["2","1","7"],
"external": true,
"blogPost": "https://developer.ibm.com/streamsdev/videos/demo-streaming-analytics-using-python-ibm-data-science-experience/",
"url": "https://dataplatform.ibm.com/exchange/public/entry/view/ec0aa15c6ab928b9b43ac0109d4395f1",
"tags": ["icp4d","data science", "iot", "watson iot","edgent","cloud","send commands", "topology","read event","raspberry pi","edge device","send data"]
},
{
"name": "How to retrieve data from IoT devices in a Python notebook",
"description": "Python notebook that shows how to receive data from and send commands to IoT devices. Includes instructions to simulate IoT device data.",
"language": ["Python"],
"category": ["2","1"],
"external": true,
"blogPost": "https://developer.ibm.com/recipes/tutorials/connect-apache-edgent-to-the-streaming-analytics-service-using-the-watson-iot-platform/",
"url": "https://dataplatform.ibm.com/exchange/public/entry/view/ec0aa15c6ab928b9b43ac0109d9b6a73",
"tags": ["icp4d","data science", "iot", "watson iot","ibm cloud","cloud","send commands", "topology","read event","raspberry pi","edge device","send data", "streaming analytics"]
},
{
"category": ["1"],
"name": "Hello World Python notebook",
"language": ["Python"],
"tags": ["icp4d","topology","ibm cloud","cloud"],
"url": "https://apsportal.ibm.com/exchange/public/entry/view/9fc33ce7301f10e21a9f92039ca9c6e8",
"blogPost": "https://developer.ibm.com/streamsdev/docs/new-in-streaming-analytics/",
"external": true,
"description": "Simple notebook showing how to connect to the Streaming Analytics service from Python. The application prints 'Hello World' to the console."
},
{
"category": ["1","2"],
"name": "Ingest and analyze patient data in Python with BioPy",
"language": ["Python"],
"tags": ["iot","watson iot","ibm cloud","cloud","streaming analytics","visualize","graph","chart","patient","health","ecg"],
"url": "https://github.com/IBMStreams/streamsx.health/blob/develop/samples/HealthcareJupyterDemo/notebooks/HealthcareDemo-Distributed.ipynb",
"external": true,
"toolkits": ["Healthcare"],
"description": "Streams+Python notebook that analyzes simulated patient data using the Streams health toolkit. It also demonstrates how to visualize data in a view using Bokeh."
},
{
"category": ["1","2"],
"name": "Use Numpy/Matplot/Pybrain from a Streams Python application",
"language": ["Python"],
"tags": ["icp4d","iot","watson iot","ibm cloud","cloud","visualize","view","graph","chart","plot"],
"blogPost": "https://developer.ibm.com/streamsdev/docs/new-in-streaming-analytics/",
"url": "https://apsportal.ibm.com/exchange/public/entry/view/9fc33ce7301f10e21a9f92039ca60bb7",
"external": true,
"description": "Demonstrates applying statistical models to real-time data. This notebook creates a neural network model to determine the probability that an engine will fail based on its temperature. It also demonstrates how to visualize data in a view."
},
{
"name": "Detect at-risk patients using the Healthcare Analytics platform",
"description": "This simulation monitors the vital signs of 100 patients and generates an alert on the dashboard if a patient's vitals are not in the normal range. It also uses the Java Application API and the ODM rules compiler.",
"language": ["Java"],
"category": ["2","7"],
"external": true,
"featured": true,
"url":"https://github.com/IBMStreams/streamsx.health/tree/develop/samples/PatientsMonitoringDemo",
"zip": "https://github.com/IBMStreams/streamsx.health/archive/develop.zip",
"tags": ["java topology","topology","odm","rules","microservices","health"]
},
{
"name": "Simple rolling average in a Python notebook",
"description": "This notebook computes a rolling average using the Streams Python API and creates a view of the results.",
"language": ["Python"],
"tags": ["ibm cloud","cloud pak for data"],
"external": true,
"category": ["1","7"],
"featured": true,
"blogPost":"https://community.ibm.com/community/user/cloudpakfordata/blogs/natasha-dsilva1/2020/10/08/analyze-streaming-data-with-python-and-cloud-pak-f",
"url": "https://github.com/IBMStreams/sample.starter_notebooks/blob/latest/Streams-RollingAverageSample.ipynb"
},
{
"name": "Score a PMML model on streaming data",
"description": "This Python notebook for IBM Cloud Pak for Data shows how to score a PMML model within a Streams Python application.",
"language": ["Python"],
"tags": ["ibm cloud","cloud pak for data"],
"external": true,
"category": ["1","7"],
"featured": true,
"blogPost": "https://community.ibm.com/community/user/cloudpakfordata/viewdocument/score-pmml-models-in-real-time-with?CommunityKey=c0c16ff2-10ef-4b50-ae4c-57d769937235&tab=librarydocuments",
"url": "https://github.com/IBMStreams/sample.starter_notebooks/blob/latest/Streams-PMMLScoringSample.ipynb"
},
{
"name": "Connect to IBM Event Streams with Python",
"description": "This Python notebook shows the required steps to connect to Event Streams to send and receive data using the Streams Python API.",
"language": ["Python"],
"tags": ["ibm cloud","cloud pak for data","event streams","kafka"],
"external": true,
"category": ["2"],
"blogPost": "https://youtu.be/s30AtkGoIc8",
"featured": true,
"url": "https://github.com/IBMStreams/sample.starter_notebooks/blob/latest/Streams-EventStreamsSample.ipynb"
},
{
"name": "Send data to IBM Db2 Event Store",
"description": "Send streaming data to IBM Db2 Event Store from a Python notebook. The tuples are inserted as rows in a Db2 Event Store table.",
"language": ["Python"],
"tags": ["ibm cloud","cloud pak for data","eventstore"],
"external": true,
"category": ["2"],
"blogPost": "https://developer.ibm.com/streamsdev/2019/07/10/connect-to-db2-event-store/",
"url": "https://github.com/IBMStreams/sample.starter_notebooks/blob/latest/Streams-EventStoreSample.ipynb"
},
{
"name": "Connect to IBM Db2 Warehouse in a Python application",
"description": "This Python notebook shows how to connect to a Db2 Warehouse database via JDBC. The application includes table creation, SQL queries, and inserts. Runs on IBM Cloud Pak for Data.",
"language": ["Python"],
"tags": ["ibm cloud","cloud pak for data","sql","jdbc"],
"external": true,
"category": ["2"],
"blogPost":"https://youtu.be/s30AtkGoIc8",
"url": "https://github.com/IBMStreams/sample.starter_notebooks/blob/latest/Streams-DatabaseSample.ipynb"
},{
"name": "Python application template",
"description": "This notebook is a template that outlines the basic steps you need to create a Python application using the Streams Python API. Runs on IBM Cloud Pak for Data.",
"language": ["Python"],
"tags": ["ibm cloud","cloud pak for data"],
"external": true,
"category": ["1"],
"featured": true,
"url": ""
}