<!DOCTYPE html>
<!-- saved from url=(0035)http://tweettracker.fulton.asu.edu/ -->
<html class=" js flexbox canvas canvastext webgl no-touch geolocation postmessage websqldatabase indexeddb hashchange history draganddrop websockets rgba hsla multiplebgs backgroundsize borderimage borderradius boxshadow textshadow opacity cssanimations csscolumns cssgradients cssreflections csstransforms csstransforms3d csstransitions fontface generatedcontent video audio localstorage sessionstorage webworkers applicationcache svg inlinesvg smil svgclippaths" lang="en"><!--<![endif]--><head><meta http-equiv="Content-Type" content="text/html; charset=UTF-8"><style type="text/css">.gm-style .gm-style-mtc label,.gm-style .gm-style-mtc div{font-weight:400}
</style><link type="text/css" rel="stylesheet" href="./xai_website_files/css"><style type="text/css">.gm-style .gm-style-cc span,.gm-style .gm-style-cc a,.gm-style .gm-style-mtc div{font-size:10px}
</style><style type="text/css">@media print { .gm-style .gmnoprint, .gmnoprint { display:none }}@media screen { .gm-style .gmnoscreen, .gmnoscreen { display:none }}</style><style type="text/css">.gm-style-pbc{transition:opacity ease-in-out;background-color:rgba(0,0,0,0.45);text-align:center}.gm-style-pbt{font-size:22px;color:white;font-family:Roboto,Arial,sans-serif;position:relative;margin:0;top:50%;-webkit-transform:translateY(-50%);-ms-transform:translateY(-50%);transform:translateY(-50%)}
</style>
<!-- Use the .htaccess and remove these lines to avoid edge case issues.
More info: h5bp.com/b/378 -->
<meta http-equiv="X-UA-Compatible" content="IE=edge,chrome=1">
<title>XAI at TAMU</title>
<meta name="description" content="">
<!-- Mobile viewport optimized: h5bp.com/viewport -->
<!-- meta name="viewport" content="width=device-width,initial-scale=1" -->
<!-- Place favicon.ico and apple-touch-icon.png in the root directory: mathiasbynens.be/notes/touch-icons -->
<link rel="stylesheet" href="./xai_website_files/style.css">
<!-- More ideas for your <head> here: h5bp.com/d/head-Tips -->
<!-- All JavaScript at the bottom, except this Modernizr build.
Modernizr enables HTML5 elements & feature detects for optimal performance.
Create your own custom Modernizr build: www.modernizr.com/download/ -->
<script src="./xai_website_files/ga.js.download"></script><script src="./xai_website_files/modernizr-2.0.6.min.js.download"></script>
<script type="text/javascript" charset="UTF-8" src="./xai_website_files/common.js.download"></script><script type="text/javascript" charset="UTF-8" src="./xai_website_files/map.js.download"></script><script type="text/javascript" charset="UTF-8" src="./xai_website_files/util.js.download"></script><script type="text/javascript" charset="UTF-8" src="./xai_website_files/infowindow.js.download"></script><style type="text/css">.gm-style {
font: 400 11px Roboto, Arial, sans-serif;
text-decoration: none;
}
.gm-style img { max-width: none; }</style><script type="text/javascript" charset="UTF-8" src="./xai_website_files/onion.js.download"></script><script type="text/javascript" charset="UTF-8" src="./xai_website_files/controls.js.download"></script><script type="text/javascript" charset="UTF-8" src="./xai_website_files/marker.js.download"></script><script type="text/javascript" charset="UTF-8" src="./xai_website_files/stats.js.download"></script></head>
<body>
<!-- Prompt IE 6 users to install Chrome Frame. Remove this if you support IE 6.
chromium.org/developers/how-tos/chrome-frame-getting-started -->
<!--[if lt IE 7 ]>
<p class=chromeframe>Your browser is <em>ancient!</em> <a href="http://microsoft.com/ie">Upgrade to a newer browser</a> or <a href="http://www.google.com/chromeframe/?redirect=true"> install Google Chrome Frame</a> to experience this site. </p>
<![endif]-->
<header class="container">
<div class="hero-unit">
<h1>XAI</h1>
<p>
The goal of XAI is to help end users understand the rationale behind an AI system's decisions. <br>
Our research targets an <strong class="red"> Interpretable Deep Learning End-to-End (IDLEE) </strong> framework. <br>
It enables end users to <strong class="blue">understand</strong> the rationale behind learning <strong class="orange"> models </strong> and their <strong class="green"> predictions </strong>.
</p>
</div>
</header>
<div role="main" class="container">
<div class="row separated">
<div class="span6">
<h2><I>Why do we need XAI?</I></h2>
<p>
With the aid of interpretability, people can better manage and appropriately trust AI systems, which is essential for making decisions in high-stakes domains.
</p>
<h2><I>Where is XAI needed?</I></h2>
<h3>Health Informatics</h3>
<p>
Interpretability is especially needed in health-related areas, including medical diagnosis, electronic health records (EHR), and the healthcare industry.
</p>
<h3>Social Informatics</h3>
<p>
Interpretability can also benefit people's social lives, helping to detect and filter spammers and fake news in social media.
</p>
</div>
<div class="span6">
<div class="carousel" id="myCarousel">
<!-- Carousel items -->
<div class="carousel-inner">
<div class="item active">
<img src="./xai_website_files/imaging medical.jpg" alt="Medical Imaging">
<div class="carousel-caption">
<p>
Medical imaging should show why and how a specific diagnosis is reached.
</p>
</div>
</div>
<div class="item">
<img src="./xai_website_files/ehr.jpg" alt="EHR">
<div class="carousel-caption">
<p>
EHRs can provide people with more evidence about diagnoses and treatments.
</p>
</div>
</div>
<div class="item">
<img src="./xai_website_files/healthcare policy.jpg" alt="Healthcare Policy">
<div class="carousel-caption">
<p>
Healthcare policy systems still need to provide clearer explanations to the public.
</p>
</div>
</div>
<div class="item">
<img src="./xai_website_files/EmailSpam.jpg" alt="Spammer">
<div class="carousel-caption">
<p>
Better interpretability of spammer detectors can help people trust classification results.
</p>
</div>
</div>
<div class="item">
<img src="./xai_website_files/fake news.jpg" alt="Fake News">
<div class="carousel-caption">
<p>
People need to identify fake news with relevant evidence and justifications.
</p>
</div>
</div>
</div>
<a class="carousel-control left" href="http://tweettracker.fulton.asu.edu/#myCarousel" data-slide="prev">‹</a>
<a class="carousel-control right" href="http://tweettracker.fulton.asu.edu/#myCarousel" data-slide="next">›</a>
</div>
</div>
</div>
<div class="row separated">
<div class="span12"><h2>Our Current Work</h2></div>
<div class="span4">
<div class="carousel-inner">
<div class="item">
<img src="./xai_website_files/work1.jpg" alt="Text Classification">
<div class="carousel-caption">
<p>
Our system can classify health text into different categories with interpretations.
</p>
</div>
</div>
</div>
</div>
<div class="span4">
<div class="carousel-inner">
<div class="item">
<img src="./xai_website_files/work2.jpg" alt="Image Classification">
<div class="carousel-caption">
<p>
Our system can classify general images while providing the corresponding reasons.
</p>
</div>
</div>
</div>
</div>
<div class="span4">
<div class="carousel-inner">
<div class="item">
<img src="./xai_website_files/work3.jpg" alt="Fake News Detection">
<div class="carousel-caption">
<p>
Our system aims to effectively detect and interpret fake news in social media.
</p>
</div>
</div>
</div>
</div>
</div>
<div class="row">
<div class="span4">
<h4 align="center">
Work 1: <br>
<strong class="green">Interpretable Health Text Classification</strong> <br>
(Refining...)
</h4>
<p align="center">
Our interpretable health text classifier helps process medical text, for example in EHRs. Each input sentence is classified into one of three categories, i.e., <I>Medication</I>, <I>Symptom</I>, and <I>Background</I>. The dominant features and discriminative patterns behind each classification are also provided as interpretations, and visualizations support user-friendly interaction.
</p>
</div>
<div class="span4">
<h4 align="center">Work 2: <br>
<strong class="orange">Interpretable Image Classification</strong> <br>
(Refining...)
</h4>
<p align="center">
Our interpretable image classifier provides two kinds of functionality. First, it interprets the deep classification model using <I>shallow models</I> such as linear models and gradient-boosted trees. Second, it interprets the predictions generated by the deep classifier, showing the relevant highly-weighted <I>super-pixels</I> as corresponding interpretations.
</p>
</div>
<div class="span4">
<h4 align="center">Work 3: <br>
<strong class="blue">Interpretable Fake News Detection</strong> <br>
(Refining...)
</h4>
<p align="center">
In this work, we aim to detect fake news on popular news websites as well as social media, and provide different forms of interpretation such as <I>Attribute Significance</I>, <I>Word/Phrase Attribution</I>, <I>Linguistic Features</I> and <I>Supporting Examples</I>. Human studies are conducted to validate the effectiveness of our system in real cases. Further improvements will be posted soon.
</p>
</div>
</div>
<div class="row separated">
<div class="span15">
<h3>Publications</h3>
<p>
</p><ul>
<li>Zepeng Huo, Xiao Huang, and Xia Hu. "<a href="./XAI_papers/Link Prediction with Personalized Social Influence.pdf">Link Prediction with Personalized Social Influence</a>", AAAI'18.</li>
<li>Xiao Huang, Qingquan Song, Jundong Li, and Xia Hu. "<a href="./XAI_papers/Exploring Expert Cognition for Attributed Network Embedding.pdf">Exploring Expert Cognition for Attributed Network Embedding</a>", WSDM'18.</li>
<li>Jun Gao, Ninghao Liu, Mark Lawley and Xia Hu. "<a href="./XAI_papers/An Interpretable Classification Framework for Information Extraction from Online Healthcare Forums.pdf">An Interpretable Classification Framework for Information Extraction from Online Healthcare Forums</a>", Journal of Healthcare Engineering Volume 2017, Article ID 2460174.</li>
<li>Ninghao Liu, Xiao Huang, and Xia Hu. "<a href="./XAI_papers/Accelerated Local Anomaly Detection via Resolving Attributed Networks.pdf">Accelerated Local Anomaly Detection via Resolving Attributed Networks</a>", Proceedings of IJCAI'17.</li>
<li>Ninghao Liu, Donghwa Shin and Xia Hu. "<a href="./XAI_papers/Contextual Outlier Interpretation.pdf">Contextual Outlier Interpretation</a>", Proceedings of IJCAI'18.</li>
<li>Mengnan Du, Ninghao Liu, Qingquan Song, and Xia Hu. "<a href="./XAI_papers/p1358-du.pdf">Towards Explanation of DNN-based Prediction with Guided Feature Inversion</a>", Proceedings of KDD'18.</li>
<li>Ninghao Liu, Xiao Huang, Jundong Li, and Xia Hu. "<a href="./XAI_papers/KDD18_emb.pdf">On Interpretation of Network Embedding via Taxonomy Induction</a>", Proceedings of KDD'18.</li>
<li>Ninghao Liu, Hongxia Yang, and Xia Hu. "<a href="./XAI_papers/KDD18_adv.pdf">Adversarial Detection with Model Interpretation</a>", Proceedings of KDD'18.</li>
<li>Fan Yang, Ninghao Liu, Suhang Wang, and Xia Hu. "<a href="./XAI_papers/SEP_ICDM.pdf">Towards Interpretation of Recommender Systems with Sorted Explanation Paths</a>", Proceedings of ICDM'18.</li>
<li>Hao Yuan, Yongjun Chen, Xia Hu, and Shuiwang Ji. "<a href="./XAI_papers/AAAI_YH.pdf">Interpreting Deep Models for Text Analysis via Optimization and Regularization Methods</a>", Proceedings of AAAI'19.</li>
<li>Ninghao Liu, Mengnan Du, and Xia Hu. "<a href="./XAI_papers/p60-liu.pdf">Representation Interpretation with Spatial Encoding and Multimodal Analytics</a>", Proceedings of WSDM'19.</li>
<li>Yin Zhang, Ninghao Liu, Shuiwang Ji, James Caverlee, and Xia Hu. "<a href="./XAI_papers/PAKDD_ZY.pdf">An Interpretable Neural Model with Interactive Stepwise Influence</a>", Proceedings of PAKDD'19.</li>
<li>Mengnan Du, Ninghao Liu, Fan Yang, Shuiwang Ji, and Xia Hu. "<a href="./XAI_papers/RNN_Attribution.pdf">On Attribution of Recurrent Neural Network Predictions via Additive Decomposition</a>", Proceedings of WWW'19.</li>
<li>Fan Yang, Shiva Pentyala, Sina Mohseni, Mengnan Du, Hao Yuan, Rhema Linder, Eric Ragan, Shuiwang Ji, and Xia Hu. "<a href="./XAI_papers/WebConf19_XFake.pdf">XFake: Explainable Fake News Detector with Visualizations</a>", Proceedings of WWW'19 (Demo).</li>
</ul>
<p></p>
</div>
</div>
<div class="row separated">
<div class="span12">
<h3>Try our online XAI fake news demo (current version) <a href="http://csedatasrv.cs.tamu.edu:3000/">here</a>!</h3> <br>
<h3>If the link above doesn't work, watch our demo video on YouTube (<a href="https://www.youtube.com/watch?v=U7Lrb5baeYs&t" target="_blank">Interpretable HealthText/Image Classification</a> and <a href="https://youtu.be/Y_E0K-M3ZNs" target="_blank"> Interpretable Fake News Detection</a> )!</h3>
</div>
</div>
<div class="row separated">
<div class="span12"><h3>This research is conducted by the <a href="http://faculty.cs.tamu.edu/xiahu/" target="_blank">DATA Lab</a> & <a href="http://people.tamu.edu/~sji/" target="_blank">DIVE Lab</a> at Texas A&M University, as well as the <a href="https://www.cise.ufl.edu/~eragan/" target="_blank">INDIE Lab</a> at the University of Florida.</h3></div>
</div>
<div class="row separated">
<div class="span9">
<div class="thumbnails">
<div class="span3">
<div class="thumbnail">
<img src="./xai_website_files/Hu.jpg" alt="Photo" height="150px" style="margin-left:20px;">
<div class="caption" align="center">
Dr. Xia (Ben) Hu is the director of the DATA Lab.
</div>
</div>
</div>
<div class="span3">
<div class="thumbnail">
<img src="./xai_website_files/Ji.jpg" alt="Photo" height="150px" style="margin-left:35px;">
<div class="caption" align="center">
Dr. Shuiwang Ji is the director of the DIVE Lab.
</div>
</div>
</div>
<div class="span3">
<div class="thumbnail">
<img src="./xai_website_files/ragan.jpg" alt="Photo" height="150px" style="margin-left:35px;">
<div class="caption" align="center">
Dr. Eric Ragan is the director of the INDIE Lab.
</div>
</div>
</div>
<div class="span3">
<div class="thumbnail">
<img src="./xai_website_files/lnh.jpg" alt="Photo" height="150px" style="margin-left:20px;">
<div class="caption" align="center">
Ninghao Liu is a Ph.D. student in the DATA Lab.
</div>
</div>
</div>
<div class="span3">
<div class="thumbnail">
<img src="./xai_website_files/sinamohseni.jpg" alt="Photo" height="150px" style="margin-left:20px;">
<div class="caption" align="center">
Sina Mohseni is a Ph.D. student in the INDIE Lab.
</div>
</div>
</div>
<div class="span3">
<div class="thumbnail">
<img src="./xai_website_files/Fan Yang.jpg" alt="Photo" height="150px" style="margin-left:20px;">
<div class="caption" align="center">
Fan Yang is a Ph.D. student in the DATA Lab.
</div>
</div>
</div>
<div class="span3">
<div class="thumbnail">
<img src="./xai_website_files/dmn.jpg" alt="Photo" height="150px" style="margin-left:20px;">
<div class="caption" align="center">
Mengnan Du is a Ph.D. student in the DATA Lab.
</div>
</div>
</div>
<div class="span3">
<div class="thumbnail">
<img src="./xai_website_files/Hao.jpg" alt="Photo" height="150px" style="margin-left:52px;">
<div class="caption" align="center">
Hao Yuan is a Ph.D. student in the DIVE Lab.
</div>
</div>
</div>
<div class="span3">
<div class="thumbnail">
<img src="./xai_website_files/Shiva.jpg" alt="Photo" height="150px" style="margin-left:20px;">
<div class="caption" align="center">
Shiva Kumar Pentyala is an M.S. student in the DATA Lab.
</div>
</div>
</div>
<div class="span9">
<br>
<strong>Contact Us:</strong> [email protected] | 979-845-8873 |
TAMU, 400 Bizzell St,
College Station, TX 77843-3112
</div>
</div>
</div>
<div class="span3">
<p><img src="./xai_website_files/tamu.jpg" alt="TAMU CSE Logo" height="40px"></p>
<p><img src="./xai_website_files/CISE_logo.png" id="onr-logo" alt="UFL CISE Logo" height="80px"></p>
<p><img src="./xai_website_files/xai.jpg" id="onr-logo" alt="XAI" height="60px"></p>
<p>
<em>This project is funded by the <b> Defense Advanced Research Projects Agency (DARPA) </b> </em></p>
</div>
</div>
</div>
<footer class="container">
© 2017 XAI at Texas A&M
</footer>
<!-- JavaScript at the bottom for fast page loading -->
<!-- Grab Google CDN's jQuery, with a protocol relative URL; fall back to local if offline -->
<script src="./xai_website_files/jquery.min.js.download"></script>
<script>window.jQuery || document.write('<script src="js/libs/jquery-1.7.1.min.js"><\/script>')</script>
<!-- scripts concatenated and minified via build script -->
<script defer="" src="./xai_website_files/plugins.js.download"></script>
<script defer="" src="./xai_website_files/script.js.download"></script>
<!-- end scripts -->
<script type="text/javascript" src="./xai_website_files/js"></script>
<!-- Asynchronous Google Analytics snippet. Change UA-XXXXX-X to be your site's ID.
mathiasbynens.be/notes/async-analytics-snippet -->
<script>
var _gaq=[['_setAccount','UA-XXXXX-X'],['_trackPageview']];
(function(d,t){var g=d.createElement(t),s=d.getElementsByTagName(t)[0];
g.src=('https:'==location.protocol?'//ssl':'//www')+'.google-analytics.com/ga.js';
s.parentNode.insertBefore(g,s)}(document,'script'));
</script>
</body></html>