<!DOCTYPE html>
<html>
<head>
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
body {
font-family: "Lato", sans-serif;
}
.sidebar {
height: 100%;
width: 0;
position: fixed;
z-index: 1;
top: 0;
left: 0;
background-color: #111;
overflow-x: hidden;
transition: 0.5s;
padding-top: 60px;
}
.sidebar a {
padding: 8px 8px 8px 32px;
text-decoration: none;
font-size: 25px;
color: #818181;
display: block;
transition: 0.3s;
}
.sidebar a:hover {
color: #f1f1f1;
}
.sidebar .closebtn {
position: absolute;
top: 0;
right: 25px;
font-size: 36px;
margin-left: 50px;
}
.openbtn {
font-size: 20px;
cursor: pointer;
background-color: #111;
color: white;
padding: 10px 15px;
border: none;
}
.openbtn:hover {
background-color: #444;
}
#main {
transition: margin-left .5s;
padding: 16px;
}
/* On smaller screens, where height is less than 450px, change the style of the sidenav (less padding and a smaller font size) */
@media screen and (max-height: 450px) {
.sidebar {padding-top: 15px;}
.sidebar a {font-size: 18px;}
}
</style>
</head>
<body>
<div id="mySidebar" class="sidebar">
<a href="javascript:void(0)" class="closebtn" onclick="closeNav()"> X </a>
<a href="index.html">Home</a>
<a href="issues.html">Issues</a>
<a href="about.html">About Us</a>
</div>
<div id="main">
<button class="openbtn" onclick="openNav()"> MENU</button>
</div>
<script>
function openNav() {
document.getElementById("mySidebar").style.width = "250px";
document.getElementById("main").style.marginLeft = "250px";
}
function closeNav() {
document.getElementById("mySidebar").style.width = "0";
document.getElementById("main").style.marginLeft= "0";
}
</script>
<script type="text/javascript" async="" src="https://cdnjs.cloudflare.com/ajax/libs/mathjax/2.7.4/MathJax.js?config=TeX-MML-AM_CHTML">
</script>
<div align="center">
<h1>Erasure of Categories</h1>
<br>
<h3>Swagato Saha</h3>
<br>
<br>
<div align="left">
<h3>Erasure of Categories</h3>
</div>
<p>
A long-standing grievance levelled by theoretical chemistry against pretty much the rest of chemistry
is the latter's (allegedly) unrestrained, injudicious use of (allegedly) dubious, ill-defined
categories to explain chemical phenomena. For instance, the categories of 'electrophilicity'/
'nucleophilicity', traditional to Organic Chemistry (which as a systematic field of study predates all
other branches of Chemistry), are of course subject to extreme criticism elsewhere, as basically a
primitive attempt to articulate properties now more acceptably described under 'Acids'/'Bases'.
Or consider the very idea of a chemical bond - it is impossible to identify within modern quantum
chemistry the familiar, benign concept of the Lewis structure.
</p>
<p>
A more critical investigation reveals that many existing chemical concepts, apparently logical to
naïve intuition (like the Lewis structure), fail to live up to modern theoretical standards variously
informed by Quantum Mechanics.
</p>
<p>
The rationale behind such reproaches is of course well-founded - it is very much the purpose of
theory proper to distil, from the blooming, buzzing confusion of categories, a minimal set that not
only reproduces chemical concepts (so far prescribed as 'Rules' to account for empirical
observations), but also extends, rectifies and resituates crude chemical concepts within a sound
theoretical basis.
It is as if, in the classical structuralist sense, there exists a humble 'multiplicity' of superficial
categories on the surface, to be excavated to retrieve the precious 'Singular' buried in its depths.
“<b>One category to rule them all!</b>”
</p>
<p>
However, it would also appear that the obverse holds true. That is, isn't the baffling diversity of the
exception-riddled discourse of chemistry the ultimate sucker punch to overly presumptuous
physicalist attempts at generalisation? The devil is in the detail. One can recall here the polemic
between <b>Roald Hoffmann</b> and <b>Richard Bader</b>, both theoretical chemists, on this very issue
('Reductionism'/'Emergence' apropos Chemistry); the former in poetic appraisal of the irreducible
complexity of Chemistry.
</p>
<p>As such, <b>the multiplicity of categories implies this very impotence of the one.</b>
Consider as an example the Least Action Principle. Even if we neglect for the moment the
complications ushered in by Quantum Mechanics, it is clear that the universality of the Least Action
Principle is predicated on its simultaneous impotence - phenomena in and around us all adhere to
the Least Action Principle, yet nonetheless (or paradoxically, because of it) it provides no account of
phenomenal difference - Why does steam rise while apples fall? Why do certain objects expand
when heated whilst others shrink?</p>
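<p>
(In standard notation - added here for reference - the principle states that a physical trajectory \(q(t)\) renders the action functional stationary:
$$ S[q] = \int_{t_1}^{t_2} L(q, \dot{q}, t)\,dt, \qquad \delta S = 0. $$
The principle itself is silent on what the Lagrangian \(L\) is for any given system; it is only that further, system-specific determination which could distinguish rising steam from falling apples.)
</p>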
<p>
This is not to say that one would observe violations of the Least Action Principle in any of these cases,
but to point out the semantic uselessness of universal principles. It is only when we proceed to
determine what the 'Action to be minimised' is in each of the cases that differences can be
expressed and semantic relations may be developed - how similar/different two things are. It is as if
the rather cumbersome task of assigning semantic designation to things, in terms of how different or
how alike they are, after all remains undone - <b>I have merely transformed the terms of inquiry,
without tackling its semantic burden.</b>
</p>
<p>
Perhaps where this impasse is best staged is in the context of Thermodynamics in Chemistry. Now,
it is standardly known that the change in Gibbs Free Energy (G) is an indicator of the
feasibility/spontaneity (can happen or can't happen) of a thermodynamic process. That is, all that
happens is thermodynamically spontaneous, and all that is thermodynamically spontaneous can and
will happen. And it is a running joke of sorts, among teachers and students alike, that whenever
one finds oneself at odds, unable to conceptually justify an observation ('Why is Fe(2+)
spontaneously oxidised to Fe(3+)?' 'Why does Mn(3+) precipitate as Mn2O3 in basic media?'),
thermodynamic spontaneity is what we allude to.
</p>
<p>[The answer goes - '<i>Because it is thermodynamically favoured!</i>']</p>
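<p>
(For reference - at constant temperature and pressure, the criterion invoked is
$$ \Delta G = \Delta H - T\,\Delta S, \qquad \Delta G \lt 0 \;\Rightarrow\; \text{spontaneous}, $$
with \(H\) the enthalpy, \(T\) the temperature and \(S\) the entropy. Note that the criterion only sorts processes into possible and impossible; it says nothing of what any particular process is.)
</p>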
<p>It is interesting nonetheless to scrutinise its 'semantic banality' - the reasoning is circular in a way.
Since the Second Law of Thermodynamics, absolute as it is, cannot be deduced in any manner
(Statistical Mechanics doesn't really do it), it is best taken as a 'Prescription' (as opposed to a
'Description') - that is, some sort of an ad-hoc 'Rule' - and following the induction of such a Rule, one
can account for a host of phenomena in a self-consistent way. However, what also follows is that
one cannot appeal to this Rule to resolve semantic questions. So, the proper answer to questions
of the above sort must typically refer to Crystal Field Theory, taking into account orbital layouts and
electron population, and provide some sort of a stability index - in other words, generate as output
the thermodynamic conditions (which are of course a priori known). To refer to a piece in the
previous issue (“Reified & Realised”), it is clear that the Semantic dimension exists only in the space
between syntactic transforms (Syntax as pertaining to Rules); in other words, it appears as if both
sides of the (figurative) equation are 'always already' determined (in the sense that Rules cannot be
deduced and must therefore be taken as the point of induction to sustain the rational constitution of
the field), and one can only interrogate, in a 'transcendental' manner, the specific terms that occur
and the relations thereby implied, without in the slightest seeking to rationally account for the
overarching Rules. (The figurative equation being: Possibility implies, and is implied by, Spontaneity.)
[Allow me a provocative digression. “If there is a God, then anything is permitted.” This remark, often
attributed to the French psychoanalyst Jacques Lacan in a wonderful reversal of Dostoevsky, identifies in
the notion of God a similar 'semantic banality'. That is, the worst historical atrocities have often
sought justification, or some minimal refuge, in their appeal to God as a self-grounding moral agency
- 'We were enacting God's Will.' It is as if God's Will is mysterious and eludes interpretation - an all-
ends-permissible Ptolemaic model - and even the worst consequences that follow a misguided
understanding and the attempt to realise it must be minimally redeemed in scope of their noble
intentionality - 'We sinned but our intentions weren't misplaced.' And isn't the central theme of
religious discourse necessarily its evocation of God as ensuring semantic certainty, as bringing
meaning to banal existence? Yet much in the spirit of our favoured 2nd Law (evoked as false
justification), anything that happens is ultimately justifiable as God's Will, with God as the ultimate,
obscure gatekeeper of the horizon of meaning. Even the radical scepticism of Descartes must meet
this unfortunate fate.]</p>
<p>
In universal principles (such as Least Action, or God's Will), absolute similitude coincides with
absolute difference. Everything is like everything else insofar as they are all instances of Action
Minimisation. Equally, everything is unlike everything else, in the most naked phenomenal-
differential sense. This is a standard theme in Hegel's Science of Logic, one that I'll be variously
returning to. The possibility of Semantics lies in the space between these two extremes (Absolute
Similitude & Absolute Difference), which are of course two sides of the same coin (Coincidence).
[The atheist however, is required to resist the dissolution of earthly difference under the universality
of a God. To the atheist, any proposition that implies such coincidence is a 'semantic fallacy'. And so,
it is only for the atheist that there exists the possibility of Semantics proper.]
[In other words, we need a God that is partial. Where grotesque violations cannot coincide with
authentic acts of emancipation, all as attempts to enact God's Will. Such a God however is
'(symbolically) castrated', and must stand impotent in the face of subversion of Will. <b>Therefore, for
Hegel, what dies on the cross is God, as the omnipotent, obscure gatekeeper. What's redeemed
hereby is the possibility of semantic constitution.</b>]
</p>
<p>It seems reasonable to infer that a proposition such as the 2nd Law of Thermodynamics is recognised
in its universality only insofar as it is realised with adequate 'semantic supplements', vis-à-vis
reference to some appropriate theory (like Crystal Field Theory in the representative example). In
itself, it cannot justify the legitimacy of certain phenomena against the simultaneous illegitimacy
of all others. (Perhaps what we need here is a Thermodynamics based on counterfactuals - one
which is responsive to semantic queries - “Why this and none other?”. Interesting work along these
lines has been in progress - <b>Constructor Theory/Chiara Marletto</b>.)</p>
<p>As such, the impotence of theory is the impotence of the universal that cannot fully assert itself, since
it would thereby risk a collapse into the logic of coincidence. It is apparently the case that theory must
be artificially resuscitated, and that truth be synthesised in a purely supplementary way, since it
cannot be located within the categories of theory. In the context of the typical problem of describing
molecularity in chemistry, one maintains that the first principles of Physics are in principle true -
however, in practice the computation is unwieldy, and therefore we must introduce a series of
literate approximations (ansätze) such that molecules may be simulated and my first principles are
still 'formally' operative. It seems Theory is overwhelmingly mute, and must be made to talk.
(When asked to comment on Bob Dylan's Nobel Prize in Literature (2016), Leonard Cohen remarked,
“To me [the award] is like pinning a medal on Mount Everest for being the highest mountain...”
Isn't the actual fate of theory here similarly superfluous? Far from being a privileged point of access
to obscure truths, Theory can only articulate, peripherally punctuate, that which has already been
uttered and is trivially known. <b>As such, this false universality of theory must be abandoned</b>.)
Let us try to establish the same argument in a totally different field - Economics. Now, it has to be
kept in mind that the truth value of a typical 'theory' in Economics functions quite differently from
what's seen in the Natural Sciences - it is not the case that an economic theory (let's say Marx's
'Labour Theory of Value' in the context of the 'Theory of Diminishing Return') is objectively falsified or
established. It is always against the background of certain silent presuppositions that theories of this
sort, and the necessary categories within the theory, come to be determined as actual or erroneous.
So the question, rather than being something like - “Is the Labour Theory of Value and the
accompanying justification for 'Diminishing Return' correct?” - becomes instead - “Against the
particular determination prescribed by the theory, do we obtain a substantial definition for each of
the categories - 'Labour', 'Value', 'Diminishing Return' etc. - that is realistic and insightful?”
(The motivation being, much like that for the theoretical chemist, to articulate precisely, upon
transcendental interrogation, the categories immanent to theory.)</p>
<p>
At the same time, one ought to resist the temptation to immediately disavow the theoretical
rigour of such arguments by claiming they are instead ballpark 'models', developed without the kind of
immaculate deductive rationale typical of theory, relying rather on ad-hoc approximations to
reproduce empirical results. This would be the Nominalist position (as opposed to the Realist one).
Now for the Realist, it is imperative to identify the inevitable multiplicity of categories, the
proliferation of offspring terms, as a necessary response to the fundamental impossibility of the
singular category, and thus as immanent to theory itself. I shall elaborate on this interpretation in a
later section.
The point essentially stands - it is not that such theories can be conceived of as wholly constituted and
determined independent of approach; rather, they require that they be equipped with the necessary
supplements in order to be realised. And in doing so, one discovers that such a theory is incomplete in a
characteristic way, and its rationale requires we develop supplementary categories in order to
compensate for the 'characteristic impotence of the singular', that which is the point of induction.
(<b>Erasure of Categories</b>)
</p>
<p>
<b>It is indeed my purpose to posit 'Erasure of Categories' as the central logical operation in the
phenomenon of 'Emergence'.</b>
</p>
<p>To return to the more familiar domain of Chemistry, and to put it in more concrete terms, it appears
that the molecular level simply doesn't follow naturally from the point of induction at the atomic
level. That is, if my singular category is to be something like the 'Ground State Electronic Wavefunction'
or 'Charge Density' (atomic features), the corresponding computation that generates molecular
features from this given information seems to run on forever. And it is under these circumstances
that we are forced to intervene, improvise, approximate, prescribe (empirical) Rules, and so on, to
establish the molecular domain. However, what is somewhat superficial in all this is that we already
interject from a point of chemical literacy to reduce computational complexity. So, what is
essentially at work here is the same (allegedly) ill-defined intuitive basis, but perhaps in a more
cultured way. (For instance, such was the position of Linus Pauling on the question of the chemical
bond - there is no singular, necessary & sufficient condition for the existence of a bond - it is
ultimately up to the 'finite judgement' of the chemist, who must, upon use of various indices and
comparisons, arrive at a conclusion.)</p>
<p>
Now, it is required here that we take a further step and try to conceptualise these prescriptive
supplements as nonetheless immanent to theory itself. That is, it is in performing or furthering the
particular line of logical inquiry opened up by the singular category that we stumble across the
supplementary offspring. In other words, one has to interpret <b>the apparent multiplicity of
categories</b> as some kind of a 'symptom' of a particular <b>impasse articulated by the singular</b>.
That is, the emergence of molecularity from atoms involves the development of a certain impasse, such
that no finite (computable) mechanism may be devised for discrete atoms to transform into
corresponding molecules, following which there is some kind of “coarse-graining”, as a
consequence of which molecules as such come to be.
</p>
<p>What's crucial and deliberate here is the situating of the computational perspective in the third
person (in this case, the atom). It is literally as if atoms cannot figure out a finite mechanistic route to
combine lawfully into molecules, and must generate immanently, in and of themselves,
supplementary categories/approximations for the proper mechanistic emergence of molecules. This is the
perspective of hetero-phenomenology, as propounded by Daniel Dennett (Phenomenology being
the study of first-person/qualia-based experience).
The following categorisation is proposed - '<b>Descriptive Approximation</b>' & '<b>Prescriptive
Approximation</b>'; the former as if in agreement with the scheme of approximations described by the
heterophenomenological 'third perspective' (and thereby immanent to theory), the latter not quite
so. There are crucial consequences and implications to this categorical wager, which I'll take up in
detail in subsequent sections.</p>
<p>
(Briefly, [descriptive] 'approximation', which was hitherto purely a matter of epistemological concern
- 'Such is how we may know of them.' - is now identified within ontology, as an existential category
- 'Such is the way they are.')
</p>
<div align="left">
<h3>Some Reflections on Epistemology</h3>
</div>
<p>
The cognitivist crusade against the then-hegemonic behaviourist paradigm, inaugurated by the likes
of Chomsky and Miller, proposes that instead of a blank slate ('tabula rasa') that our environs
merely fill in, plastic and amoeboid as behaviourists suppose, we are in fact predisposed a certain
way, owing to rich innate structures and schematisms which allow for some functionalities whilst
eliminating others. There is of course a parallel here with Darwinian thought (as opposed to Lamarck,
who is closer to Behaviourism).
Epistemology as such is concerned with such relations of predisposition as exhibited by different
fields or different modes of inquiry - the idea being, the accessibility of knowledge is predicated
upon the mode of inquiry ('episteme') I am operating within. Let us consider a typical case of study
- Depression. For a biochemist, this obviously translates to sub-normal serotonin and dopamine
levels in blood. For a psychoanalyst, it may have a different translation. This is not to say that the
two are necessarily in disagreement, or that they are even talking about the same thing. The very
field of biochemistry allows for a certain articulation, a certain 'representation' of a 'real' (in this case
Depression), whereas Psychoanalysis has its own 'representation'. Thereby a distinction seems to
develop between the 'real' of an entity and its possible 'representation(s)'. This is of course in strict
correspondence with Kantian Transcendental Philosophy, with its characteristic distinction of
'Noumenon' and 'Phenomenon'. And of course, it doesn't come as a surprise that Chomsky defines
the field of Cognitive Science as 'Naturalised Epistemology', wherein our cognitive apparatus entails
a particular 'representation'.
</p>
<p>
<b>What follows is an attempt to characterise the episteme of Natural Science</b>. Let us consider a
system of 3 static charges - 2 negative charges lying at the 2 poles of a line segment, and the 3rd
(positive) charge at the mid-point; the charges are of appropriate magnitude so as to be in mechanical
equilibrium in this orientation. It is interesting to observe that this system is in 'stable
equilibrium' with respect to displacement (of the positive charge) along the Y axis (the Y axis being
perpendicular to the line segment), whilst in 'unstable equilibrium' with respect to displacement
along the X axis. This implies that an arbitrary initial constellation of charges couldn't have resulted in the
ordered assembly described. (There are restrictions on the initial configuration of the 3 charges.) For
a system of masses, the situation is direr - in the electrostatic case, attractive and repulsive forces
are both operative; for masses there can only ever be an attractive force. And for more complex
assemblies such as planetary systems, the problem is unfathomably worse. It appears that there has
been a trade-off - natural phenomena appear to us as wholly transparent, ordered and
mechanistic when approached in their equilibrium state; the slightest of perturbations to this stasis
completely thwarts all attempts at systematic inquiry. <b>The question of origin ('What is the initial
configuration of the charges/masses?') is blurred out of the epistemic horizon</b>. Along similar lines one
could even consider the question of the origin of the universe. It is for deeply structural, epistemological
reasons that scientific inquiry is confronted with unprecedented resistance in taking up such
problems.
</p>
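<p>
(The stability claim above can be checked with a minimal numerical sketch of my own - charge magnitudes and Coulomb's constant are normalised to 1 purely for illustration. The potential energy of the positive charge falls under a small displacement along X and rises under one along Y:)
</p>

```javascript
// Unit positive charge in the field of two unit negative charges fixed at
// (-a, 0) and (+a, 0); Coulomb's constant and all magnitudes set to 1.
function potential(x, y, a = 1.0) {
  const r1 = Math.hypot(x + a, y); // distance to the charge at (-a, 0)
  const r2 = Math.hypot(x - a, y); // distance to the charge at (+a, 0)
  return -(1 / r1 + 1 / r2);       // attractive interaction, hence negative
}

const U0 = potential(0, 0);   // energy at the equilibrium mid-point
const Ux = potential(0.1, 0); // small displacement along X
const Uy = potential(0, 0.1); // small displacement along Y

console.log(Ux < U0); // true: energy decreases along X -> unstable equilibrium
console.log(Uy > U0); // true: energy increases along Y -> stable equilibrium
```

<p>
(The same sign pattern holds for any displacement small compared to the separation of the fixed charges.)
</p>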
<p>
And it is not too far a stretch to include in the same category the problem of Climate Change, which
remains a mystery beyond scientific predictability. Insofar as 'Nature' is conceptualised, or rather
reified, into this holistic, wholly constituted, self-regulatory and infinitely stable mechanism, so-called
“unnatural” deviations from this norm (in and as Climate Change) cannot be articulated within the
same categories. No wonder Manabe & Hasselmann were awarded the Nobel Prize in Physics (2021)
for their noteworthy contributions to this topic.
One is perhaps required to drop altogether this artificial picture of Nature (as some sort of a
secularised God), as the philosopher Slavoj Zizek opines.
</p>
<p>
I shall hereby develop a line of thought involving Gödel's Incompleteness Theorem, and this will serve
as the conclusion. I rely on fundamentals described in “Incompleteness & Other” in the previous
issue, including its consequences for Subjectivity. The Incompleteness Theorem profoundly echoes
the themes of Kantian Transcendental Subjectivity, as was clearly acknowledged by Gödel himself. As
such, it asserts the destitution of subjective rationale as irreparably cut off from the 'Real' - it is only
from an external vantage point that the consistency of the mathematical system I am operating
within can be established; or, in Kantian terms, I am prohibited from assuming the authenticity of my
subjective order as an absolute feature of the 'Real'. Consider an elementary instance of subjectivity
at work - some people like to read a book indoors while it rains outside. As such, within their
subjective sphere there is such a semantic association at work. Now, it is certainly not the case that,
should we venture to deconstruct the anatomy of rainfall, we'll stumble across in its depths the
semantics of pluviophilia. The 'act of reading' is in no way contained in 'rain' - this for Kant is the
purely 'synthetic' (as opposed to 'analytic') dimension of transcendental subjectivity. In other words,
meaning can only ever be invented rather than discovered.
</p>
<p>I now refer to Penrose's theory of Orchestrated Objective Reduction on the origin of Consciousness.
I am, however, interested solely in the argumentation; consequences beyond this may be
neglected here. For Penrose, both Quantum Mechanics (Wave Function Collapse) and (human)
Consciousness stage a unique by-passing of the conditions of the Incompleteness Theorem - that is, the
Incompleteness Theorem does not speak of the kind of logic encompassed by the two.
Notwithstanding, he goes a step further and concludes, to spectacular controversy, that the two
(Quantum Mechanics & Consciousness) could thereby describe each other - an instance of
abductive reasoning. (Some of the criticism points to the particular biological organelle he
nominates as the seat of Quantum Mechanical phenomena in the human brain - microtubules.
These are aspects I shall not be considering here.)
In a way, Penrose's argument boils down to the idea that two systems can be 'similarly irrational', or
'similarly undecidable', in which case, by-passing the Kantian criterion of Transcendental Opacity
(Meaning as immanent reflection; invention as opposed to discovery), we have inter-subjective or
inter-epistemic correspondence. [Crucially, this leads us to the Freudian category and accompanying
ontology of the 'Symptom', which I'll take up in a later issue.]
(It must also be noted that there have been attempts to appropriate Penrose's (& Stuart Hameroff's)
thesis as establishing, within Quantum Physics, the familiar theme of the 'Decentred Subject', which goes
back as far as the Freudian Unconscious.)
In response to this, we are required to reconsider our very understanding of what constitutes theory.
It is of course generally expected of theory to impose some sort of rational 'arborescence' on the
primordial multiplicity of categories, rendering it well-structured. However, what cannot be
disregarded is the radical potential within theory to mobilise contradictions, impossibilities and
impasses, so as to include within its explanatory scope similarly elusive, 'irrational' phenomena.
Recall the inspiring exchange between Wittgenstein and Turing on the Liar's Paradox.
What must be consciously secured against the hegemony of false universals is the authentic,
'evental' status of an impossible problem. (Herein lies the prospect of 'Emergence'.)
In other words, 'the hard problem of consciousness' isn't merely a problem to be solved (if it at all
can be); dialectically, it is also a medium to reflect upon the foundations of existing theory. Much
has been said of the (alleged) falseness of 'the hard problem'; it's time we observe our foundations.</p>
<br>
<br>
<br>
<p>
(The final section has been reserved for a chapter in the next issue which will continue this query.)
</p>
</div>
</body>
</html>