Commit 621b315: update

aiceflower committed Jul 21, 2023
1 parent bf721f3 commit 621b315
Showing 7 changed files with 35 additions and 56 deletions.
16 changes: 8 additions & 8 deletions docs/deployment/deploy-quick.md
@@ -507,10 +507,10 @@ wds.linkis.admin.password= #Password
sh bin/linkis-cli -submitUser hadoop -engineType shell-1 -codeType shell -code "whoami"
#hive engine tasks
-sh bin/linkis-cli -submitUser hadoop -engineType hive-2.3.3 -codeType hql -code "show tables"
+sh bin/linkis-cli -submitUser hadoop -engineType hive-3.1.3 -codeType hql -code "show tables"
#spark engine tasks
-sh bin/linkis-cli -submitUser hadoop -engineType spark-2.4.3 -codeType sql -code "show tables"
+sh bin/linkis-cli -submitUser hadoop -engineType spark-3.2.1 -codeType sql -code "show tables"
#python engine task
sh bin/linkis-cli -submitUser hadoop -engineType python-python2 -codeType python -code 'print("hello, world!")'
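The flags above can also be combined in a single submission; a minimal sketch, assuming the `-creator` and `-proxyUser` flags used elsewhere on this page (the creator name and the query are placeholders):

```
sh bin/linkis-cli -creator IDE -submitUser hadoop -proxyUser hadoop \
  -engineType spark-3.2.1 -codeType sql -code "show tables"
```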
@@ -551,9 +551,9 @@ $ tree linkis-package/lib/linkis-engineconn-plugins/ -L 3
linkis-package/lib/linkis-engineconn-plugins/
├── hive
│   ├── dist
-│   │   └── 2.3.3   # version is 2.3.3, engineType is hive-2.3.3
+│   │   └── 3.1.3   # version is 3.1.3, engineType is hive-3.1.3
│   └── plugin
-│       └── 2.3.3
+│       └── 3.1.3
├── python
│   ├── dist
│   │   └── python2
@@ -566,9 +566,9 @@ linkis-package/lib/linkis-engineconn-plugins/
│       └── 1
└── spark
    ├── dist
-    │   └── 2.4.3
+    │   └── 3.2.1
    └── plugin
-        └── 2.4.3
+        └── 3.2.1
````
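To confirm a single engine's materialized version without printing the full tree, listing its `dist` directory is enough; a sketch based on the layout above:

```
$ ls linkis-package/lib/linkis-engineconn-plugins/spark/dist
3.2.1
```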
#### Method 2: Check the Linkis database table
@@ -595,13 +595,13 @@ Insert yarn data information
INSERT INTO `linkis_cg_rm_external_resource_provider`
(`resource_type`, `name`, `labels`, `config`) VALUES
('Yarn', 'sit', NULL,
'{\r\n"rmWebAddress": "http://xx.xx.xx.xx:8088",\r\n"hadoopVersion": "2.7.2",\r\n"authorEnable":false, \r\n"user":"hadoop",\r\n"pwd":"123456"\r\n}'
'{\r\n"rmWebAddress": "http://xx.xx.xx.xx:8088",\r\n"hadoopVersion": "3.3.4",\r\n"authorEnable":false, \r\n"user":"hadoop",\r\n"pwd":"123456"\r\n}'
);
config field properties
"rmWebAddress": "http://xx.xx.xx.xx:8088", #need to bring http and port
"hadoopVersion": "2.7.2",
"hadoopVersion": "3.3.4",
"authorEnable":true, //Whether authentication is required You can verify the username and password by visiting http://xx.xx.xx.xx:8088 in the browser
"user":"user",//username
"pwd":"pwd"//Password
6 changes: 1 addition & 5 deletions docs/engine-usage/jdbc.md
@@ -202,11 +202,7 @@ Note: After modifying the configuration under the `IDE` tag, you need to specify
sh ./bin/linkis-cli -creator IDE \
-engineType jdbc-4 -codeType jdbc \
-code "show tables" \
--submitUser hadoop -proxyUser hadoop \
--runtimeMap wds.linkis.jdbc.connect.url=jdbc:mysql://127.0.0.1:3306 \
--runtimeMap wds.linkis.jdbc.driver=com.mysql.jdbc.Driver \
--runtimeMap wds.linkis.jdbc.username=root \
--runtimeMap wds.linkis.jdbc.password=123456 \
+-submitUser hadoop -proxyUser hadoop
```

#### 4.2.2 Task interface configuration
@@ -43,23 +43,20 @@ Linkis 1.4.0 mainly adds the following features: upgrade hadoop, spark, and hive
## New Features
- \[EC][LINKIS-4263](https://github.com/apache/linkis/pull/4263) Upgrade the default versions of Hadoop, Spark, and Hive to 3.x
- \[EC-Hive][LINKIS-4359](https://github.com/apache/linkis/pull/4359) Hive EC supports concurrent tasks
- \[COMMON][LINKIS-4424](https://github.com/apache/linkis/pull/4424) linkis-storage supports the OSS file system
- \[COMMON][LINKIS-4435](https://github.com/apache/linkis/pull/4435) linkis-storage supports the S3 file system
- \[EC-Impala][LINKIS-4458](https://github.com/apache/linkis/pull/4458) Add support for the Impala EC plugin
- \[ECM][LINKIS-4452](https://github.com/apache/linkis/pull/4452) Do not kill ECs when the ECM restarts
- \[EC][LINKIS-4460](https://github.com/apache/linkis/pull/4460) Linkis supports multiple clusters
-- \[COMMON][LINKIS-4524](https://github.com/apache/linkis/pull/4524) Support the postgresql database
-- \[DMS][LINKIS-4486](https://github.com/apache/linkis/pull/4486) The data source module supports the Tidb data source
-- \[DMS][LINKIS-4496](https://github.com/apache/linkis/pull/4496) The data source module supports the Starrocks data source
-- \[DMS][LINKIS-4513](https://github.com/apache/linkis/pull/4513) The data source module supports the Gaussdb data source
-- \[DMS][LINKIS-](https://github.com/apache/linkis/pull/4581) The data source module supports the OceanBase data source
-- \[EC-Spark][LINKIS-4568](https://github.com/apache/linkis/pull/4568) Spark JDBC supports the dm and kingbase databases
-- \[EC-Spark][LINKIS-4539](https://github.com/apache/linkis/pull/4539) Spark ETL supports excel
-- \[EC-Spark][LINKIS-4534](https://github.com/apache/linkis/pull/4534) Spark ETL supports redis
-- \[EC-Spark][LINKIS-4564](https://github.com/apache/linkis/pull/4564) Spark ETL supports RocketMQ
-- \[EC-Spark][LINKIS-4560](https://github.com/apache/linkis/pull/4560) Spark ETL supports mongo and es
-- \[EC-Spark][LINKIS-4569](https://github.com/apache/linkis/pull/4569) Spark ETL supports solr
-- \[EC-Spark][LINKIS-4563](https://github.com/apache/linkis/pull/4563) Spark ETL supports kafka
+- \[COMMON][LINKIS-4524](https://github.com/apache/linkis/pull/4524) Support the postgresql database
+- \[DMS][LINKIS-4486](https://github.com/apache/linkis/pull/4486) Support the Tidb data source
+- \[DMS][LINKIS-4496](https://github.com/apache/linkis/pull/4496) Support the Starrocks data source
+- \[DMS][LINKIS-4513](https://github.com/apache/linkis/pull/4513) Support the Gaussdb data source
+- \[DMS][LINKIS-](https://github.com/apache/linkis/pull/4581) Support the OceanBase data source
+- \[EC-Spark][LINKIS-4568](https://github.com/apache/linkis/pull/4568) Spark JDBC supports the dm and kingbase databases
+- \[EC-Spark][LINKIS-4539](https://github.com/apache/linkis/pull/4539) Spark ETL supports excel
+- \[EC-Spark][LINKIS-4534](https://github.com/apache/linkis/pull/4534) Spark ETL supports redis
+- \[EC-Spark][LINKIS-4564](https://github.com/apache/linkis/pull/4564) Spark ETL supports RocketMQ
+- \[EC-Spark][LINKIS-4560](https://github.com/apache/linkis/pull/4560) Spark ETL supports mongo and es
+- \[EC-Spark][LINKIS-4569](https://github.com/apache/linkis/pull/4569) Spark ETL supports solr
+- \[EC-Spark][LINKIS-4563](https://github.com/apache/linkis/pull/4563) Spark ETL supports kafka
- \[EC-Spark][LINKIS-4538](https://github.com/apache/linkis/pull/4538) Spark ETL supports data lakes


@@ -76,7 +73,7 @@ Linkis 1.4.0 mainly adds the following features: upgrade hadoop, spark, and hive
- \[EC-Trino][LINKIS-4526](https://github.com/apache/linkis/pull/4526) Convert Trino EC code to Java
- \[EC-Presto][LINKIS-4514](https://github.com/apache/linkis/pull/4514) Convert Presto EC code to Java
- \[EC-Elasticsearch][LINKIS-4531](https://github.com/apache/linkis/pull/4531) Convert Elasticsearch EC code to Java
-- \[COMMON][LINKIS-4475](https://github.com/apache/linkis/pull/4475) Use the latest mysql DDL in the k8s deployment
+- \[COMMON][LINKIS-4475](https://github.com/apache/linkis/pull/4475) Use the latest mysql DDL in the k8s deployment
- \[EC-Flink][LINKIS-4556](https://github.com/apache/linkis/pull/4556) Add a task interceptor for the Flink EC
- \[GATEWAY][LINKIS-4548](https://github.com/apache/linkis/pull/4548) Clear all backend caches when a user logs out
- \[COMMON][LINKIS-4554](https://github.com/apache/linkis/pull/4554) Add the MDC log format to Linkis for tracing the JobID
@@ -89,16 +86,14 @@ Linkis 1.4.0 mainly adds the following features: upgrade hadoop, spark, and hive
## Bug Fixes
- \[EC-Hive][LINKIS-4246](https://github.com/apache/linkis/pull/4246) Hive engine version numbers support hyphens, e.g. hive3.1.2-cdh5.12.0
- \[COMMON][LINKIS-4438](https://github.com/apache/linkis/pull/4438) Fix a nohup startup error
-- \[EC][LINKIS-4429](https://github.com/apache/linkis/pull/4429) Fix the CPU load average calculation bug
+- \[EC][LINKIS-4429](https://github.com/apache/linkis/pull/4429) Fix the CPU load average calculation bug
- \[PE][LINKIS-4457](https://github.com/apache/linkis/pull/4457) Fix validation of parameters configured from the management console
- \[DMS][LINKIS-4500](https://github.com/apache/linkis/pull/4500) Fix type conversion failures between the client and data sources
- \[COMMON][LINKIS-4480](https://github.com/apache/linkis/pull/4480) Fix building the default profile with jdk17
- \[CG][LINKIS-4663](https://github.com/apache/linkis/pull/4663) Fix a possible NPE when reusing engines
- \[LM][LINKIS-4652](https://github.com/apache/linkis/pull/4652) Fix an NPE thrown when creating an engine node
-- \[][LINKIS-](https://github.com/apache/linkis/pull/)
-- \[][LINKIS-](https://github.com/apache/linkis/pull/)


## Acknowledgements
-The release of Apache Linkis 1.3.2 owes much to the contributors of the Linkis community. Thanks to all community contributors, including but not limited to the following Contributors (in no particular order):
+The release of Apache Linkis 1.4.0 owes much to the contributors of the Linkis community. Thanks to all community contributors, including but not limited to the following Contributors (in no particular order):
casionone,MrFengqin,zhangwejun,Zhao,ahaoyao,duhanmin,guoshupei,shixiutao,CharlieYan24,peacewong,GuoPhilipse,aiceflower,waynecookie,jacktao007,chenghuichen,ws00428637,ChengJie1053,dependabot,jackxu2011,sjgllgh,rarexixi,pjfanning,v-kkhuang,binbinCheng,stdnt-xiao,mayinrain。
@@ -496,10 +496,10 @@ wds.linkis.admin.password= #Password
sh bin/linkis-cli -submitUser hadoop -engineType shell-1 -codeType shell -code "whoami"
#hive engine task
-sh bin/linkis-cli -submitUser hadoop -engineType hive-2.3.3 -codeType hql -code "show tables"
+sh bin/linkis-cli -submitUser hadoop -engineType hive-3.1.3 -codeType hql -code "show tables"
#spark engine task
-sh bin/linkis-cli -submitUser hadoop -engineType spark-2.4.3 -codeType sql -code "show tables"
+sh bin/linkis-cli -submitUser hadoop -engineType spark-3.2.1 -codeType sql -code "show tables"
#python engine task
sh bin/linkis-cli -submitUser hadoop -engineType python-python2 -codeType python -code 'print("hello, world!")'
@@ -540,9 +540,9 @@ $ tree linkis-package/lib/linkis-engineconn-plugins/ -L 3
linkis-package/lib/linkis-engineconn-plugins/
├── hive
│   ├── dist
-│   │   └── 2.3.3   # version is 2.3.3, engineType is hive-2.3.3
+│   │   └── 3.1.3   # version is 3.1.3, engineType is hive-3.1.3
│   └── plugin
-│       └── 2.3.3
+│       └── 3.1.3
├── python
│   ├── dist
│   │   └── python2
@@ -555,9 +555,9 @@ linkis-package/lib/linkis-engineconn-plugins/
│       └── 1
└── spark
    ├── dist
-    │   └── 2.4.3
+    │   └── 3.2.1
    └── plugin
-        └── 2.4.3
+        └── 3.2.1
```

#### Method 2: Check the Linkis database table
@@ -585,13 +585,13 @@ select * from linkis_cg_engine_conn_plugin_bml_resources
INSERT INTO `linkis_cg_rm_external_resource_provider`
(`resource_type`, `name`, `labels`, `config`) VALUES
('Yarn', 'sit', NULL,
'{\r\n"rmWebAddress": "http://xx.xx.xx.xx:8088",\r\n"hadoopVersion": "2.7.2",\r\n"authorEnable":false,\r\n"user":"hadoop",\r\n"pwd":"123456"\r\n}'
'{\r\n"rmWebAddress": "http://xx.xx.xx.xx:8088",\r\n"hadoopVersion": "3.3.4",\r\n"authorEnable":false,\r\n"user":"hadoop",\r\n"pwd":"123456"\r\n}'
);
config field properties
"rmWebAddress": "http://xx.xx.xx.xx:8088", # must include http and the port
-"hadoopVersion": "2.7.2",
+"hadoopVersion": "3.3.4",
"authorEnable":true, // whether authentication is required; the username and password can be verified by visiting http://xx.xx.xx.xx:8088 in a browser
"user":"user", // username
"pwd":"pwd" // password
@@ -202,11 +202,7 @@ labels.put(LabelKeyConstant.CODE_TYPE_KEY, "jdbc"); // required codeType
sh ./bin/linkis-cli -creator IDE \
-engineType jdbc-4 -codeType jdbc \
-code "show tables" \
--submitUser hadoop -proxyUser hadoop \
--runtimeMap wds.linkis.jdbc.connect.url=jdbc:mysql://127.0.0.1:3306 \
--runtimeMap wds.linkis.jdbc.driver=com.mysql.jdbc.Driver \
--runtimeMap wds.linkis.jdbc.username=root \
--runtimeMap wds.linkis.jdbc.password=123456 \
+-submitUser hadoop -proxyUser hadoop
```

#### 4.2.2 Task interface configuration
@@ -202,11 +202,7 @@ labels.put(LabelKeyConstant.CODE_TYPE_KEY, "jdbc"); // required codeType
sh ./bin/linkis-cli -creator IDE \
-engineType jdbc-4 -codeType jdbc \
-code "show tables" \
--submitUser hadoop -proxyUser hadoop \
--runtimeMap wds.linkis.jdbc.connect.url=jdbc:mysql://127.0.0.1:3306 \
--runtimeMap wds.linkis.jdbc.driver=com.mysql.jdbc.Driver \
--runtimeMap wds.linkis.jdbc.username=root \
--runtimeMap wds.linkis.jdbc.password=123456 \
+-submitUser hadoop -proxyUser hadoop
```

#### 4.2.2 Task interface configuration
6 changes: 1 addition & 5 deletions versioned_docs/version-1.3.2/engine-usage/jdbc.md
@@ -202,11 +202,7 @@ Note: After modifying the configuration under the `IDE` tag, you need to specify
sh ./bin/linkis-cli -creator IDE \
-engineType jdbc-4 -codeType jdbc \
-code "show tables" \
--submitUser hadoop -proxyUser hadoop \
--runtimeMap wds.linkis.jdbc.connect.url=jdbc:mysql://127.0.0.1:3306 \
--runtimeMap wds.linkis.jdbc.driver=com.mysql.jdbc.Driver \
--runtimeMap wds.linkis.jdbc.username=root \
--runtimeMap wds.linkis.jdbc.password=123456 \
+-submitUser hadoop -proxyUser hadoop
```

#### 4.2.2 Task interface configuration