[Enhancement #480] add code style framework #488
Conversation
…nation (#261)
* [feature] flink k8s native mode support
* [feature] flink k8s native mode support
* [issue#220] refactoring SubmitRequest, SubmitResponse to adapt k8s submit operations
* [issue#220] refactoring SubmitRequest, SubmitResponse to adapt k8s submit operations
* [issue#220] New dto object for flink stop action parameter transfer process
* [issue#220] refactor: move the parameters of the flink stop method to a dedicated dto object
* modify configuration constants of workspace(#251)
* typo(#251)
* add isAnyBank method(#251)
* add unified fs operator defined(#251)
* register FsOperator to SpringBoot Bean(#251)
* remove unnecessary import(#251)
* extend the signature of method upload, copy, copyDir(#251)
* Separate workspace storage type into configuration(#251)
* Separate workspace storage type into configuration(#251)
* add fileMd5 method(#251)
* replace the code reference of HdfsUtils to FsOperator(#251)
* change the bean injection behavior of FsOperator(#251)
* change the config key of streamx.workspace(#251)
* fix stack overflow bug
* LfsOperator.upload support dir source
* Update ConfigConst.scala
* Update HdfsOperator.scala
* Update LfsOperator.scala
* Update UnifiledFsOperator.scala
* Update Utils.scala
* compatible with flink k8s submit
* compatible with flink k8s submit

Co-authored-by: benjobs <benjobs@qq.com>
[feature] Support for submit/cancel Flink-SQL-Job on Flink-K8s-native session/application mode.
[feature] Support for tracking Flink job status and metrics information from Flink-K8s cluster.
[feature] Support for StreamX instance to manage both Flink-Yarn or Flink-K8s runtime Cluster, and to use either
hi @wuchunfu: Thanks for your contribution. It seems there are two main problems with this PR:
LGTM.
@@ -100,15 +100,15 @@ object ConfigConst {
    val KEY_SPARK_BATCH_DURATION = "spark.batch.duration"

    // flink
-   def KEY_APP_CONF(prefix: String = null): String = if (prefix == null) "conf" else s"${prefix}conf"
+   def keyAppConf(prefix: String = null): String = if (prefix == null) "conf" else s"${prefix}conf"
This is actually a constant, like KEY_SPARK_BATCH_DURATION; it is just that usage sometimes requires the "--" prefix and sometimes does not. Essentially it is still a constant, so this change is not recommended. Everything starting with KEY is one of the parameter keys used throughout the project.
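The behavior the reviewer describes can be sketched in Java for illustration (a hypothetical port of the Scala `KEY_APP_CONF`; the class name is an assumption):

```java
public class ConfigKeys {
    // Behaves like a constant: always resolves to the key "conf",
    // optionally decorated with a caller-supplied prefix such as "--".
    public static String keyAppConf(String prefix) {
        return prefix == null ? "conf" : prefix + "conf";
    }

    public static void main(String[] args) {
        System.out.println(keyAppConf(null)); // plain key
        System.out.println(keyAppConf("--")); // CLI-style key
    }
}
```

Because the result is fully determined by the prefix, renaming it to camelCase makes it look like ordinary computation rather than a key lookup, which is the reviewer's objection.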
@@ -48,13 +48,13 @@ object ConfigUtils {
    param.foreach(x => kafkaProperty.put(x._1, x._2.trim))
    val _topic = topic match {
      case SIGN_EMPTY =>
-       val top = kafkaProperty.getOrElse(KEY_KAFKA_TOPIC, null)
+       val top = kafkaProperty.asScala.getOrElse(KEY_KAFKA_TOPIC, null)
This optimization is not recommended here. The original code relies on an implicit conversion: the collection is converted automatically, which is an advanced Scala syntax feature, so the explicit asScala is unnecessary.
@@ -23,7 +23,7 @@ package com.streamxhub.streamx.common.util

  import com.streamxhub.streamx.common.conf.ConfigConst._
  import org.apache.commons.collections.CollectionUtils
- import org.apache.commons.lang.StringUtils
+ import org.apache.commons.lang3.StringUtils
good job.
@@ -227,7 +227,9 @@ object HadoopUtils extends Logger {
    Option(reusableHdfs).getOrElse {
      reusableHdfs = Try {
        ugi.doAs[FileSystem](new PrivilegedAction[FileSystem]() {
+         // scalastyle:off FileSystemGet
This comment can be removed.
@@ -187,18 +187,19 @@ object SqlSplitter {
   * @return
   */
  private[streamx] def isSingleLineComment(curChar: Char, nextChar: Char): Boolean = {
+   var flag = false
Changing the value of a variable is not recommended in Scala. Scala favors functional programming: variables are generally declared as val, and mutating a variable after declaration is discouraged unless there is no alternative.
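The point generalizes beyond Scala: a mutable flag that is set once and then returned can usually be replaced by the boolean expression itself. A minimal Java sketch, assuming (hypothetically) that the method detects the start of a SQL `--` comment:

```java
public class CommentCheck {
    // Mutable-flag style: correct, but the variable is noise.
    static boolean isSingleLineCommentMutable(char cur, char next) {
        boolean flag = false;
        if (cur == '-' && next == '-') {
            flag = true;
        }
        return flag;
    }

    // Expression style: the condition itself is the result.
    static boolean isSingleLineComment(char cur, char next) {
        return cur == '-' && next == '-';
    }
}
```

Both versions behave identically; the second simply removes the mutation that the reviewer is warning against.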
@@ -64,7 +65,7 @@

    private String flinkConfTemplate = null;

-   private String PROD_ENV_NAME = "prod";
+   private String prodEnvName = "prod";
private final String PROD_ENV_NAME is better.
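The convention the reviewer is pointing at can be sketched as follows (field names come from the diff; the surrounding class is hypothetical):

```java
public class EnvNames {
    // A true constant: final and UPPER_SNAKE_CASE, so the
    // all-caps name is justified by immutability.
    private static final String PROD_ENV_NAME = "prod";

    // A mutable instance field stays camelCase.
    private String currentEnvName = PROD_ENV_NAME;

    public String currentEnvName() {
        return currentEnvName;
    }
}
```

That is, rather than renaming the field to camelCase, adding `final` keeps both the constant semantics and the all-caps name consistent with the style rule.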
@@ -1201,7 +1247,7 @@ public boolean start(Application appParam, boolean auto) throws Exception {
    } else {
      FlinkSql flinkSql = flinkSqlService.getEffective(application.getId(), false);
      jarPackDeps = Application.Dependency.jsonToDependency(flinkSql.getDependency()).toJarPackDeps();
-     optionMap.put(ConfigConst.KEY_FLINK_SQL(null), flinkSql.getSql());
+     optionMap.put(ConfigConst.keyFlinkSql(null), flinkSql.getSql());
It is not recommended to modify this.
@@ -110,8 +160,7 @@
      new ThreadPoolExecutor.AbortPolicy()
    );

-   private final Pattern JOBNAME_PATTERN = Pattern.compile("^[.\\x{4e00}-\\x{9fa5}A-Za-z0-9_—-]+$");
+   private final Pattern jobNamePattern = Pattern.compile("^[.\\x{4e00}-\\x{9fa5}A-Za-z0-9_—-]+$");
this is a constant...
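The pattern in question accepts dots, CJK characters (U+4E00 to U+9FA5), ASCII letters and digits, underscores, and dashes. A small self-contained Java check (the regex is copied from the diff; the class and method names are made up for illustration):

```java
import java.util.regex.Pattern;

public class JobNameCheck {
    // Compiled once and never reassigned: a constant, hence UPPER_SNAKE_CASE.
    static final Pattern JOB_NAME_PATTERN =
        Pattern.compile("^[.\\x{4e00}-\\x{9fa5}A-Za-z0-9_—-]+$");

    static boolean isValid(String name) {
        return JOB_NAME_PATTERN.matcher(name).matches();
    }
}
```

Since the compiled pattern is immutable and identical for every instance, the reviewer's point stands: it should keep the constant naming (and could also be made `static final`).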
@@ -107,21 +116,21 @@
    @Autowired
    private AlertService alertService;

-   private static ApplicationService applicationService;
+   private static ApplicationService APPLICATION_SERVICE;
this is not a constant...
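This is the inverse of the earlier cases: a `static` field that is assigned after construction is not a constant, so UPPER_SNAKE_CASE is wrong for it. A minimal sketch of such a mutable static holder (a common pattern when bridging to dependency injection; names here are hypothetical):

```java
public class ServiceHolder {
    // Mutable static reference, NOT a constant: no final, camelCase name.
    private static String applicationService;

    // Assigned later, e.g. from an init hook once injection completes.
    static void init(String service) {
        applicationService = service;
    }

    static String get() {
        return applicationService;
    }
}
```

Because the field starts as null and is written at runtime, the `static final` constant convention does not apply.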
    private static final Map<String, Long> CHECK_POINT_MAP = new ConcurrentHashMap<>();

    private static final Map<String, Counter> CHECK_POINT_FAILED_MAP = new ConcurrentHashMap<>();

    private static final Map<Long, OptionState> OPTIONING = new ConcurrentHashMap<>();

-   private final Long STARTING_INTERVAL = 1000L * 30;
+   private final Long startingInterval = 1000L * 30;
STARTING_INTERVAL is better.
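Following the same reasoning as the surrounding `static final` maps, the interval can be declared as a constant; using the primitive `long` instead of boxed `Long` also avoids an unnecessary wrapper object. A sketch (the enclosing class is hypothetical):

```java
public class WatcherConfig {
    // 30-second interval as a true constant: static final, primitive long.
    static final long STARTING_INTERVAL = 1000L * 30;
}
```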
What problem does this PR solve?
Issue Number: close #480
Problem Summary:
add code style framework