Fix the problem of Sonar code scanning #4369

Open
wants to merge 14 commits into base: master
4 changes: 2 additions & 2 deletions README.md
Original file line number Diff line number Diff line change
Expand Up @@ -85,8 +85,8 @@ Since the first release of Linkis in 2019, it has accumulated more than **700**

| **Engine Name** | **Supported Component Version<br/>(Default Dependent Version)** | **Linkis Version Requirements** | **Included in Release Package<br/> By Default** | **Description** |
|:---- |:---- |:---- |:---- |:---- |
|Spark|Apache 2.0.0~2.4.7, <br/>CDH >= 5.4.0, <br/>(default Apache Spark 2.4.3)|\>=1.0.3|Yes|Spark EngineConn, supports SQL, Scala, Pyspark and R code|
|Hive|Apache >= 1.0.0, <br/>CDH >= 5.4.0, <br/>(default Apache Hive 2.3.3)|\>=1.0.3|Yes|Hive EngineConn, supports HiveQL code|
|Spark|Apache >= 2.0.0, <br/>CDH >= 5.4.0, <br/>(default Apache Spark 3.2.1)|\>=1.0.3|Yes|Spark EngineConn, supports SQL, Scala, Pyspark and R code|
|Hive|Apache >= 1.0.0, <br/>CDH >= 5.4.0, <br/>(default Apache Hive 3.1.3)|\>=1.0.3|Yes|Hive EngineConn, supports HiveQL code|
|Python|Python >= 2.6, <br/>(default Python2*)|\>=1.0.3|Yes |Python EngineConn, supports python code|
|Shell|Bash >= 2.0|\>=1.0.3|Yes|Shell EngineConn, supports Bash shell code|
|JDBC|MySQL >= 5.0, Hive >=1.2.1, <br/>(default Hive-jdbc 2.3.4)|\>=1.0.3|No|JDBC EngineConn, already supports MySQL and HiveQL; can be quickly extended to support other engines with a JDBC Driver package, such as Oracle|
Expand Down
4 changes: 2 additions & 2 deletions README_CN.md
Original file line number Diff line number Diff line change
Expand Up @@ -82,8 +82,8 @@ Linkis 自 2019 年开源发布以来,已累计积累了 700 多家试验企

| **引擎名** | **支持底层组件版本 <br/>(默认依赖版本)** | **Linkis 版本要求** | **是否默认包含在发布包中** | **说明** |
|:---- |:---- |:---- |:---- |:---- |
|Spark|Apache 2.0.0~2.4.7, <br/>CDH >= 5.4.0, <br/>(默认 Apache Spark 2.4.3)|\>=1.0.3|是|Spark EngineConn, 支持 SQL, Scala, Pyspark 和 R 代码|
|Hive|Apache >= 1.0.0, <br/>CDH >= 5.4.0, <br/>(默认 Apache Hive 2.3.3)|\>=1.0.3|是|Hive EngineConn, 支持 HiveQL 代码|
|Spark|Apache >= 2.0.0, <br/>CDH >= 5.4.0, <br/>(默认 Apache Spark 3.2.1)|\>=1.0.3|是|Spark EngineConn, 支持 SQL, Scala, Pyspark 和 R 代码|
|Hive|Apache >= 1.0.0, <br/>CDH >= 5.4.0, <br/>(默认 Apache Hive 3.1.3)|\>=1.0.3|是|Hive EngineConn, 支持 HiveQL 代码|
|Python|Python >= 2.6, <br/>(默认 Python2*)|\>=1.0.3|是|Python EngineConn, 支持 python 代码|
|Shell|Bash >= 2.0|\>=1.0.3|是|Shell EngineConn, 支持 Bash shell 代码|
|JDBC|MySQL >= 5.0, Hive >=1.2.1, <br/>(默认 Hive-jdbc 2.3.4)|\>=1.0.3|否|JDBC EngineConn, 已支持 MySQL 和 HiveQL,可快速扩展支持其他有 JDBC Driver 包的引擎, 如 Oracle|
Expand Down
4 changes: 2 additions & 2 deletions docs/configuration/linkis-computation-governance-common.md
Original file line number Diff line number Diff line change
Expand Up @@ -4,8 +4,8 @@
| Module Name (Service Name) | Parameter Name | Default Value | Description |
| -------- | -------- | ----- |----- |
|linkis-computation-governance-common|wds.linkis.rm| | wds.linkis.rm |
|linkis-computation-governance-common|wds.linkis.spark.engine.version|2.4.3 |spark.engine.version|
|linkis-computation-governance-common|wds.linkis.hive.engine.version| 1.2.1 |hive.engine.version|
|linkis-computation-governance-common|wds.linkis.spark.engine.version|3.2.1 |spark.engine.version|
|linkis-computation-governance-common|wds.linkis.hive.engine.version| 3.1.3 |hive.engine.version|
|linkis-computation-governance-common|wds.linkis.python.engine.version|python2 | python.engine.version |
|linkis-computation-governance-common|wds.linkis.python.code_parser.enabled| false |python.code_parser.enabled|
|linkis-computation-governance-common|wds.linkis.scala.code_parser.enabled| false | scala.code_parser.enabled |
Expand Down
2 changes: 1 addition & 1 deletion docs/configuration/linkis-manager-common.md
Original file line number Diff line number Diff line change
Expand Up @@ -4,7 +4,7 @@
| Module Name (Service Name) | Parameter Name | Default Value | Description |Used|
| -------- | -------- | ----- |----- | ----- |
|linkis-manager-common|wds.linkis.default.engine.type |spark|engine.type|
|linkis-manager-common|wds.linkis.default.engine.version |2.4.3|engine.version|
|linkis-manager-common|wds.linkis.default.engine.version |3.2.1|engine.version|
|linkis-manager-common|wds.linkis.manager.admin|hadoop|manager.admin|
|linkis-manager-common|wds.linkis.rm.application.name|ResourceManager|rm.application.name|
|linkis-manager-common|wds.linkis.rm.wait.event.time.out| 1000 * 60 * 12L |event.time.out|
Expand Down
2 changes: 1 addition & 1 deletion docs/configuration/linkis-udf.md
Original file line number Diff line number Diff line change
Expand Up @@ -3,7 +3,7 @@

| Module Name (Service Name) | Parameter Name | Default Value | Description |Used|
| -------- | -------- | ----- |----- | ----- |
|linkis-udf|wds.linkis.udf.hive.exec.path |/appcom/Install/DataWorkCloudInstall/linkis-linkis-Udf-0.0.3-SNAPSHOT/lib/hive-exec-1.2.1.jar|udf.hive.exec.path|
|linkis-udf|wds.linkis.udf.hive.exec.path |/appcom/Install/DataWorkCloudInstall/linkis-linkis-Udf-0.0.3-SNAPSHOT/lib/hive-exec-3.1.3.jar|udf.hive.exec.path|
|linkis-udf|wds.linkis.udf.tmp.path|/tmp/udf/|udf.tmp.path|
|linkis-udf|wds.linkis.udf.share.path|/mnt/bdap/udf/|udf.share.path|
|linkis-udf|wds.linkis.udf.share.proxy.user| hadoop|udf.share.proxy.user|
Expand Down
2 changes: 1 addition & 1 deletion docs/errorcode/linkis-configuration-errorcode.md
Original file line number Diff line number Diff line change
Expand Up @@ -15,7 +15,7 @@
|linkis-configuration |14100|CategoryName cannot be included '-'(类别名称不能包含 '-')|CANNOT_BE_INCLUDED|LinkisConfigurationErrorCodeSummary|
|linkis-configuration |14100|Creator is null, cannot be added(创建者为空,无法添加)|CREATOR_IS_NULL_CANNOT_BE_ADDED|LinkisConfigurationErrorCodeSummary|
|linkis-configuration |14100|Engine type is null, cannot be added(引擎类型为空,无法添加)|ENGINE_TYPE_IS_NULL|LinkisConfigurationErrorCodeSummary|
|linkis-configuration |14100|The saved engine type parameter is incorrect, please send it in a fixed format, such as spark-2.4.3(保存的引擎类型参数有误,请按照固定格式传送,例如spark-2.4.3)|INCORRECT_FIXED_SUCH|LinkisConfigurationErrorCodeSummary|
|linkis-configuration |14100|The saved engine type parameter is incorrect, please send it in a fixed format, such as spark-3.2.1(保存的引擎类型参数有误,请按照固定格式传送,例如spark-3.2.1)|INCORRECT_FIXED_SUCH|LinkisConfigurationErrorCodeSummary|
|linkis-configuration |14100|Incomplete request parameters, please reconfirm(请求参数不完整,请重新确认)|INCOMPLETE_RECONFIRM|LinkisConfigurationErrorCodeSummary|
|linkis-configuration |14100|Only admin can modify category(只有管理员才能修改目录)|ONLY_ADMIN_CAN_MODIFY|LinkisConfigurationErrorCodeSummary|
|linkis-configuration |14100|The label parameter is empty(标签参数为空)|THE_LABEL_PARAMETER_IS_EMPTY|LinkisConfigurationErrorCodeSummary|
Expand Down
2 changes: 1 addition & 1 deletion docs/trino-usage.md
Original file line number Diff line number Diff line change
Expand Up @@ -46,7 +46,7 @@ Linkis1.X是通过标签来进行的,所以需要在我们数据库中插入

```
linkis_ps_configuration_config_key: 插入引擎的配置参数的key和默认values
linkis_cg_manager_label:插入引擎label如:hive-2.3.3
linkis_cg_manager_label:插入引擎label如:hive-3.1.3
linkis_ps_configuration_category: 插入引擎的目录关联关系
linkis_ps_configuration_config_value: 插入引擎需要展示的配置
linkis_ps_configuration_key_engine_relation:配置项和引擎的关联关系
Expand Down
Original file line number Diff line number Diff line change
Expand Up @@ -350,7 +350,7 @@ public long convertTo(long d, ByteUnit u) {
}
}

public double toBytes(long d) {
public long toBytes(long d) {
if (d < 0) {
throw new IllegalArgumentException("Negative size value. Size must be positive: " + d);
}
Expand Down
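The hunk above changes `toBytes` to return `long` instead of `double`. Byte counts are whole numbers, and a `double` can only represent integers exactly up to 2^53, so the narrower return type actually *removes* a silent precision loss. A minimal sketch (the validation mirrors the diff; the class name is illustrative):

```java
public class ByteUnitDemo {
    // Corrected signature: byte sizes are integral, so long keeps them exact,
    // while a double would round values above 2^53.
    public static long toBytes(long d) {
        if (d < 0) {
            throw new IllegalArgumentException("Negative size value. Size must be positive: " + d);
        }
        return d;
    }

    public static void main(String[] args) {
        long big = (1L << 53) + 1; // smallest long a double cannot represent exactly
        System.out.println(toBytes(big));        // exact
        System.out.println((long) (double) big); // rounded: precision lost in a double
    }
}
```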
Original file line number Diff line number Diff line change
Expand Up @@ -15,29 +15,24 @@
* limitations under the License.
*/

package org.apache.linkis.engineplugin.spark.Interpreter

import org.apache.linkis.common.utils.Utils
import org.apache.linkis.engineplugin.spark.common.State
import org.apache.linkis.scheduler.executer.ExecuteResponse

import scala.concurrent.TimeoutException
import scala.concurrent.duration.Duration

/**
*/

trait Interpreter {
def state: State

def execute(code: String): ExecuteResponse

def close(): Unit

@throws(classOf[TimeoutException])
@throws(classOf[InterruptedException])
final def waitForStateChange(oldState: State, atMost: Duration): Unit = {
Utils.waitUntil({ () => state != oldState }, atMost)
package org.apache.linkis.common.utils;

import java.io.Closeable;
import java.io.IOException;

public class CloseIoUtils {

public static void closeAll(Closeable... cs) {
if (cs != null) {
for (Closeable c : cs) {
if (c != null) {
try {
c.close();
} catch (IOException e) {
e.printStackTrace();
}
}
}
}
}

}
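The new `CloseIoUtils.closeAll` helper above is a null-tolerant varargs closer. A small usage sketch of the same pattern (class name and the lambda-based `Closeable` are illustrative, not part of the PR):

```java
import java.io.Closeable;
import java.io.IOException;

public class CloseAllDemo {
    // Mirrors the helper added in the diff: skips null entries and
    // swallows IOException from close so every resource gets a chance to close.
    public static void closeAll(Closeable... cs) {
        if (cs == null) return;
        for (Closeable c : cs) {
            if (c != null) {
                try {
                    c.close();
                } catch (IOException e) {
                    e.printStackTrace(); // as in the diff; a logger would be preferable
                }
            }
        }
    }

    public static void main(String[] args) {
        final boolean[] closed = {false};
        Closeable resource = () -> closed[0] = true; // Closeable is a functional interface
        closeAll(resource, null);                    // null entry is skipped safely
        System.out.println(closed[0]);
    }
}
```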
Original file line number Diff line number Diff line change
Expand Up @@ -79,20 +79,13 @@ class CustomMonthType(date: String, std: Boolean = true, isEnd: Boolean = false)

def -(months: Int): String = {
val dateFormat = DateTypeUtils.dateFormatLocal.get()
if (std) {
DateTypeUtils.getMonth(std, isEnd, DateUtils.addMonths(dateFormat.parse(date), -months))
} else {
DateTypeUtils.getMonth(std, isEnd, DateUtils.addMonths(dateFormat.parse(date), -months))
}
DateTypeUtils.getMonth(std, isEnd, DateUtils.addMonths(dateFormat.parse(date), -months))

}

def +(months: Int): String = {
val dateFormat = DateTypeUtils.dateFormatLocal.get()
if (std) {
DateTypeUtils.getMonth(std, isEnd, DateUtils.addMonths(dateFormat.parse(date), months))
} else {
DateTypeUtils.getMonth(std, isEnd, DateUtils.addMonths(dateFormat.parse(date), months))
}
DateTypeUtils.getMonth(std, isEnd, DateUtils.addMonths(dateFormat.parse(date), months))
}

override def toString: String = {
Expand All @@ -111,20 +104,14 @@ class CustomMonType(date: String, std: Boolean = true, isEnd: Boolean = false) {

def -(months: Int): String = {
val dateFormat = DateTypeUtils.dateFormatMonLocal.get()
if (std) {
DateTypeUtils.getMon(std, isEnd, DateUtils.addMonths(dateFormat.parse(date), -months))
} else {
DateTypeUtils.getMon(std, isEnd, DateUtils.addMonths(dateFormat.parse(date), -months))
}
DateTypeUtils.getMon(std, isEnd, DateUtils.addMonths(dateFormat.parse(date), -months))

}

def +(months: Int): String = {
val dateFormat = DateTypeUtils.dateFormatMonLocal.get()
if (std) {
DateTypeUtils.getMon(std, isEnd, DateUtils.addMonths(dateFormat.parse(date), months))
} else {
DateTypeUtils.getMon(std, isEnd, DateUtils.addMonths(dateFormat.parse(date), months))
}
DateTypeUtils.getMon(std, isEnd, DateUtils.addMonths(dateFormat.parse(date), months))

}

override def toString: String = {
Expand Down
Original file line number Diff line number Diff line change
Expand Up @@ -18,6 +18,7 @@
package org.apache.linkis.protocol.util;

import java.util.AbstractMap;
import java.util.Objects;

public class ImmutablePair<K, V> {

Expand Down Expand Up @@ -62,4 +63,9 @@ private boolean eq(Object o1, Object o2) {
return false;
}
}

@Override
public int hashCode() {
return Objects.hash(entry);
}
}
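Adding `hashCode` alongside an existing `equals` (as the `ImmutablePair` hunk above does) restores the `Object` contract: objects that compare equal must produce the same hash, or they can land in different `HashMap`/`HashSet` buckets. A minimal sketch of the pattern, assuming the pair is backed by an `entry` field as in the diff:

```java
import java.util.AbstractMap;
import java.util.Objects;

public class PairDemo<K, V> {
    private final AbstractMap.SimpleImmutableEntry<K, V> entry;

    public PairDemo(K key, V value) {
        this.entry = new AbstractMap.SimpleImmutableEntry<>(key, value);
    }

    @Override
    public boolean equals(Object o) {
        if (this == o) return true;
        if (!(o instanceof PairDemo)) return false;
        return Objects.equals(entry, ((PairDemo<?, ?>) o).entry);
    }

    // Without this override, two equal pairs could hash differently,
    // breaking HashMap lookups.
    @Override
    public int hashCode() {
        return Objects.hash(entry);
    }
}
```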
Original file line number Diff line number Diff line change
Expand Up @@ -18,6 +18,7 @@
package org.apache.linkis.rpc.serializer;

import java.util.Map;
import java.util.Objects;
import java.util.concurrent.ConcurrentHashMap;

import scala.Option;
Expand Down Expand Up @@ -49,6 +50,7 @@ public static <T> String serialize(T obj) {
}
Class<T> clazz = (Class<T>) obj.getClass();
Schema<T> schema = getSchema(clazz);
Objects.requireNonNull(schema, "schema must not be null");
byte[] data;
LinkedBuffer buffer = LinkedBuffer.allocate(LinkedBuffer.DEFAULT_BUFFER_SIZE);
try {
Expand All @@ -61,6 +63,7 @@ public static <T> String serialize(T obj) {

public static <T> T deserialize(String str, Class<T> clazz) {
Schema<T> schema = getSchema(clazz);
Objects.requireNonNull(schema, "schema must not be null");
T obj = schema.newMessage();
ProtostuffIOUtil.mergeFrom(toByteArray(str), obj, schema);
return obj;
Expand Down
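The `Objects.requireNonNull(schema, ...)` calls added above make a null schema fail fast with a descriptive message, instead of surfacing as a bare `NullPointerException` deeper inside the serialization path. A sketch of the behavior (method name is illustrative):

```java
import java.util.Objects;

public class RequireNonNullDemo {
    static String describe(Object schema) {
        // Throws NullPointerException("schema must not be null") immediately
        // when schema is null, pinpointing the failure site.
        Objects.requireNonNull(schema, "schema must not be null");
        return schema.toString();
    }

    public static void main(String[] args) {
        System.out.println(describe("proto-schema"));
        try {
            describe(null);
        } catch (NullPointerException e) {
            System.out.println(e.getMessage());
        }
    }
}
```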
Original file line number Diff line number Diff line change
Expand Up @@ -37,7 +37,7 @@ public static List<List<String>> getExcelTitle(
} else {
res = XlsxUtils.getBasicInfo(in, file);
}
if (res == null && res.size() < 2) {
if (res == null || res.size() < 2) {
throw new Exception("There is a problem with the file format(文件格式有问题)");
}
List<String> headerType = new ArrayList<>();
Expand Down
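The `&&` → `||` fix above matters because `&&` short-circuits the wrong way: when `res == null` is true, the old condition immediately evaluated `res.size()` and threw a `NullPointerException` instead of the intended format error. A sketch of the corrected guard:

```java
import java.util.Collections;
import java.util.List;

public class NullCheckDemo {
    // True when the parsed result is unusable, mirroring the corrected guard.
    // With '&&' instead of '||', a null res would NPE on res.size().
    static boolean isInvalid(List<String> res) {
        return res == null || res.size() < 2;
    }

    public static void main(String[] args) {
        System.out.println(isInvalid(null));                    // invalid, no NPE
        System.out.println(isInvalid(Collections.emptyList())); // invalid
        System.out.println(isInvalid(List.of("header", "row"))); // valid
    }
}
```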
Original file line number Diff line number Diff line change
Expand Up @@ -213,7 +213,9 @@ public boolean copy(String origin, String dest) throws IOException {
setOwner(new FsPath(dest), user, null);
}
} catch (Throwable e) {
file.delete();
if (!file.delete()) {
LOG.error("File deletion failed(文件删除失败)");
}
if (e instanceof IOException) {
throw (IOException) e;
} else {
Expand Down Expand Up @@ -370,14 +372,18 @@ public boolean create(String dest) throws IOException {
if (!isOwner(file.getParent())) {
throw new IOException("you have no permission to create file " + dest);
}
file.createNewFile();
if (!file.createNewFile()) {
LOG.error("File creation failed(文件创建失败)");
}
try {
setPermission(new FsPath(dest), this.getDefaultFilePerm());
if (!user.equals(getOwner(dest))) {
setOwner(new FsPath(dest), user, null);
}
} catch (Throwable e) {
file.delete();
if (!file.delete()) {
LOG.error("File deletion failed(文件删除失败)");
}
if (e instanceof IOException) {
throw (IOException) e;
} else {
Expand Down
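The hunks above address a common Sonar finding: `File.delete()` and `File.createNewFile()` report failure through their boolean return value, not an exception, so ignoring the result silently swallows errors. A sketch of the check (helper name is illustrative; the diff logs instead of returning):

```java
import java.io.File;

public class DeleteCheckDemo {
    // delete() signals failure via its return value, so the caller must check it.
    static boolean deleteReporting(File f) {
        if (!f.delete()) {
            System.err.println("File deletion failed: " + f);
            return false;
        }
        return true;
    }

    public static void main(String[] args) {
        // Deleting a file that does not exist simply returns false.
        deleteReporting(new File("definitely-not-here-12345.tmp"));
    }
}
```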
Original file line number Diff line number Diff line change
Expand Up @@ -98,7 +98,6 @@ class StorageExcelWriter(
case TimestampType => style.setDataFormat(format.getFormat("m/d/yy h:mm"))
case DecimalType => style.setDataFormat(format.getFormat("#.000000000"))
case BigDecimalType => style.setDataFormat(format.getFormat("#.000000000"))
case _ => style.setDataFormat(format.getFormat("@"))
}
}
style
Expand Down
Original file line number Diff line number Diff line change
Expand Up @@ -24,7 +24,7 @@ public class UJESConstants {
public static final String QUERY_PAGE_SIZE_NAME = "pageSize";
public static final int QUERY_PAGE_SIZE_DEFAULT_VALUE = 100;

public static final Long DRIVER_QUERY_SLEEP_MILLS = 500l;
public static final Long DRIVER_QUERY_SLEEP_MILLS = 500L;
public static final Integer DRIVER_REQUEST_MAX_RETRY_TIME = 3;

public static final String QUERY_STATUS_NAME = "status";
Expand All @@ -40,7 +40,4 @@ public class UJESConstants {
public static final Integer IDX_FOR_LOG_TYPE_ALL = 3; // 0: Error 1: WARN 2:INFO 3: ALL

public static final int DEFAULT_PAGE_SIZE = 500;

public static final String DEFAULT_SPARK_ENGINE = "spark-2.4.3";
public static final String DEFAULT_HIVE_ENGINE = "hive-1.2.1";
}
Original file line number Diff line number Diff line change
Expand Up @@ -182,8 +182,7 @@ public Float getJobProgress() {
return null;
}
if (result instanceof JobInfoResult) {
if (((JobInfoResult) result).getRequestPersistTask() != null
&& ((JobInfoResult) result).getRequestPersistTask() != null) {
if (((JobInfoResult) result).getRequestPersistTask() != null) {
return ((JobInfoResult) result).getRequestPersistTask().getProgress();
}
}
Expand Down
Original file line number Diff line number Diff line change
Expand Up @@ -24,6 +24,7 @@
import org.apache.linkis.cli.core.exception.BuilderException;
import org.apache.linkis.cli.core.exception.error.CommonErrMsg;
import org.apache.linkis.cli.core.utils.LogUtils;
import org.apache.linkis.common.utils.CloseIoUtils;

import org.apache.commons.lang3.StringUtils;

Expand Down Expand Up @@ -167,20 +168,20 @@ public static String getProxyUser(
}

public static String readFile(String path) {
BufferedReader bufReader = null;
try {
File inputFile = new File(path);

InputStream inputStream = new FileInputStream(inputFile);
InputStreamReader iReader = new InputStreamReader(inputStream);
BufferedReader bufReader = new BufferedReader(iReader);
bufReader = new BufferedReader(iReader);

StringBuilder sb = new StringBuilder();
StringBuilder line;
while (bufReader.ready()) {
line = new StringBuilder(bufReader.readLine());
sb.append(line).append(System.lineSeparator());
}

return sb.toString();

} catch (FileNotFoundException fe) {
Expand All @@ -197,6 +198,8 @@ public static String readFile(String path) {
CommonErrMsg.BuilderBuildErr,
"Cannot read user specified script file: " + path,
e);
} finally {
CloseIoUtils.closeAll(bufReader);
}
}
}
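The hunk above moves the reader into a `finally` block via `CloseIoUtils.closeAll`. Since Java 7, try-with-resources achieves the same guaranteed cleanup more concisely; a hedged sketch of that alternative (not what the PR does, just the idiomatic equivalent):

```java
import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;

public class ReadFileDemo {
    // Equivalent of the patched readFile: the reader is closed automatically,
    // even if an exception is thrown mid-read.
    static String readFile(String path) throws IOException {
        StringBuilder sb = new StringBuilder();
        try (BufferedReader reader = new BufferedReader(new FileReader(path))) {
            String line;
            while ((line = reader.readLine()) != null) {
                sb.append(line).append(System.lineSeparator());
            }
        }
        return sb.toString();
    }
}
```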
Original file line number Diff line number Diff line change
Expand Up @@ -85,12 +85,12 @@ public void before() {

/* Test different task type */

// "-engineType", "spark-2.4.3",
// "-engineType", "spark-3.2.1",
// "-codeType", "sql",
// "-code", "show tables;show tables;show tables",

//
// "-engineType", "hive-1.2.1",
// "-engineType", "hive-3.1.3",
// "-codeType", "sql",
// "-code", "show tables;",

Expand All @@ -101,11 +101,11 @@ public void before() {
"-code",
"whoami",

// "-engineType", "spark-2.4.3",
// "-engineType", "spark-3.2.1",
// "-codeType", "py",
// "-code", "print ('hello')",

// "-engineType", "spark-2.4.3",
// "-engineType", "spark-3.2.1",
// "-codeType", "scala",
// "-codePath", "src/test/resources/testScala.scala",

Expand Down
Original file line number Diff line number Diff line change
Expand Up @@ -21,6 +21,8 @@

import org.apache.commons.lang3.StringUtils;

import scala.tools.nsc.doc.model.Object;

/** Data Structure for command Parameter. Command String does not contain the name of Parameter. */
public class Parameter<T> extends BaseOption<T> implements Cloneable {
final String paramName;
Expand Down
Original file line number Diff line number Diff line change
Expand Up @@ -27,6 +27,7 @@
import java.io.File;
import java.io.FileInputStream;
import java.io.InputStream;
import java.util.Objects;
import java.util.Properties;

import org.slf4j.Logger;
Expand Down Expand Up @@ -73,6 +74,7 @@ public Properties getProperties() {
"PRP0002", ErrorLevel.ERROR, CommonErrMsg.PropsReaderErr, "Source: " + propsPath, e);
} finally {
try {
Objects.requireNonNull(in, "InputStream must not be null");
in.close();
} catch (Exception ignore) {
// ignore
Expand Down
Loading