Stream load import fails with abort reason: coordinate BE is down

2022-06-29 Thread james
Hello,


     The cluster has 1 FE and 3 BEs.


     When importing data with stream load, some loads fail.
The BE log shows the following errors:


W0625 21:11:04.788328 19871 stream_load_executor.cpp:281] commit transaction 
failed, errmsg=errCode = 2, detailMessage = transaction [1638] is already 
aborted. abort reason: coordinate BE is down, 
id=d14f60f37fe29182-e2452ee346c991b8, job_id=-1, txn_id=1638, 
label=xxx_t_3e9877180e1f415686443124a9fb72b0, elapse(s)=34
W0625 21:11:04.788368 19871 stream_load.cpp:144] handle streaming load failed, 
id=d14f60f37fe29182-e2452ee346c991b8, errmsg=errCode = 2, detailMessage = 
transaction [1638] is already aborted. abort reason: coordinate BE is down
W0625 21:11:04.796619 19890 stream_load_executor.cpp:281] commit transaction 
failed, errmsg=errCode = 2, detailMessage = transaction [1640] is already 
aborted. abort reason: coordinate BE is down, 
id=bb4ac386050f365f-5697ff9a7614929d, job_id=-1, txn_id=1640, 
label=xxx_t_30bde0223a4c49bfae8855ea510e5157, elapse(s)=34
W0625 21:11:04.796648 19890 stream_load.cpp:144] handle streaming load failed, 
id=bb4ac386050f365f-5697ff9a7614929d, errmsg=errCode = 2, detailMessage = 
transaction [1640] is already aborted. abort reason: coordinate BE is down
W0625 21:11:04.796726 19872 stream_load_executor.cpp:281] commit transaction 
failed, errmsg=errCode = 2, detailMessage = transaction [1632] is already 
aborted. abort reason: coordinate BE is down, 
id=3e4bc3cd3e3e853f-cf8604ac349eb1b5, job_id=-1, txn_id=1632, 
label=xxx_t_a20283afde854b54b362b91141571445, elapse(s)=34
W0625 21:11:04.796748 19894 stream_load_executor.cpp:281] commit transaction 
failed, errmsg=errCode = 2, detailMessage = transaction [1635] is already 
aborted. abort reason: coordinate BE is down, 
id=0c4a7e043b34eb88-9d27887e66e1e392, job_id=-1, txn_id=1635, 
label=xxx_t_638ea9d5da994462b56674f4218aa95d, elapse(s)=34
W0625 21:11:04.796767 19872 stream_load.cpp:144] handle streaming load failed, 
id=3e4bc3cd3e3e853f-cf8604ac349eb1b5, errmsg=errCode = 2, detailMessage = 
transaction [1632] is already aborted. abort reason: coordinate BE is down
W0625 21:11:04.796799 19894 stream_load.cpp:144] handle streaming load failed, 
id=0c4a7e043b34eb88-9d27887e66e1e392, errmsg=errCode = 2, detailMessage = 
transaction [1635] is already aborted. abort reason: coordinate BE is down
W0626 21:07:20.208528 19894 stream_load_executor.cpp:281] commit transaction 
failed, errmsg=errCode = 2, detailMessage = transaction [1834] is already 
aborted. abort reason: coordinate BE is down, 
id=6c4fc4d91eead9d1-c7bd542607b427a5, job_id=-1, txn_id=1834, 
label=xxx_t_a9c563f5eb2f45ee8a299889704ee42e, elapse(s)=33
W0626 21:07:20.208611 19894 stream_load.cpp:144] handle streaming load failed, 
id=6c4fc4d91eead9d1-c7bd542607b427a5, errmsg=errCode = 2, detailMessage = 
transaction [1834] is already aborted. abort reason: coordinate BE is down
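For context on the error: a stream load is an HTTP PUT sent to a single BE (either directly or via the FE, which redirects to a BE), and that BE coordinates the load transaction. If the FE considers that BE down while the load is still in flight, it aborts the transaction with "abort reason: coordinate BE is down". A minimal Scala sketch of such a request, with placeholder host, ports, database, table and credentials:

import java.net.URI
import java.net.http.{HttpClient, HttpRequest, HttpResponse}
import java.net.http.HttpRequest.BodyPublishers
import java.util.Base64

object StreamLoadSketch {
  def main(args: Array[String]): Unit = {
    // Basic auth for a hypothetical root user with an empty password.
    val auth = Base64.getEncoder.encodeToString("root:".getBytes("UTF-8"))

    // The BE this PUT is sent to coordinates the load transaction.
    // Host, port (BE webserver_port, 8040 by default), database and table
    // names here are placeholders.
    val request = HttpRequest.newBuilder()
      .uri(URI.create("http://be_host:8040/api/example_db/example_tbl/_stream_load"))
      .header("Authorization", s"Basic $auth")
      .header("label", "example_label_001")   // unique label per load
      .header("column_separator", ",")
      .PUT(BodyPublishers.ofString("1,foo\n2,bar\n"))
      .build()

    val client = HttpClient.newHttpClient()
    val response = client.send(request, HttpResponse.BodyHandlers.ofString())
    println(response.body())  // JSON result with Status / Message fields
  }
}

If these aborts line up with a restart or missed heartbeats of the BE named in the request URL, that is typically the cause.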

Spark fails to read a Doris 1.0 table: Unrecognized field "keysType"

2022-06-29 Thread james
Hello,


      Using Doris 1.0, with 1 FE and 3 BEs.


      Reading a Doris table from Spark fails with the following error:






org.apache.doris.spark.exception.DorisException: Doris FE's response cannot map 
to schema. res: 
{"keysType":"DUP_KEYS","properties":[{"name":"id","aggregation_type":"","comment":"
 id","type":"BIGINT"},{"name":"name","aggregation_type":"NONE","comment":" 
","type":"VARCHAR"}],"status":200}
 at 
org.apache.doris.spark.rest.RestService.parseSchema(RestService.java:303)
 at org.apache.doris.spark.rest.RestService.getSchema(RestService.java:279)
 at 
org.apache.doris.spark.sql.SchemaUtils$.discoverSchemaFromFe(SchemaUtils.scala:51)
 at 
org.apache.doris.spark.sql.SchemaUtils$.discoverSchema(SchemaUtils.scala:41)
 at 
org.apache.doris.spark.sql.DorisRelation.lazySchema$lzycompute(DorisRelation.scala:48)
 at 
org.apache.doris.spark.sql.DorisRelation.lazySchema(DorisRelation.scala:48)
 at org.apache.doris.spark.sql.DorisRelation.schema(DorisRelation.scala:52)
 at 
org.apache.spark.sql.execution.datasources.DataSource.resolveRelation(DataSource.scala:402)
 at 
org.apache.spark.sql.DataFrameReader.loadV1Source(DataFrameReader.scala:223)
 at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:211)
 at org.apache.spark.sql.DataFrameReader.load(DataFrameReader.scala:167)
 at 
com.vxdata.datacenter.dataquality.util.SparkSqlUtil$.loadTable(SparkSqlUtil.scala:178)
 at com.vxdata.buildmodel.computer.input.DBInput.compute(DBInput.scala:11)
 at 
com.vxdata.buildmodel.computer.service.JobFlowService.invokeCompute(JobFlowService.scala:239)
 at 
com.vxdata.buildmodel.computer.service.JobFlowService$$anonfun$exec$1.apply(JobFlowService.scala:81)
 at 
com.vxdata.buildmodel.computer.service.JobFlowService$$anonfun$exec$1.apply(JobFlowService.scala:78)
 at 
scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:59)
 at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:48)
 at 
com.vxdata.buildmodel.computer.service.JobFlowService.exec(JobFlowService.scala:78)
 at 
com.vxdata.buildmodel.computer.service.JobFlowService.exec(JobFlowService.scala:34)
 at 
com.vxdata.etl.SparkETLMain$$anonfun$1.apply$mcV$sp(SparkETLMain.scala:64)
 at 
com.vxdata.datacenter.dataquality.common.TApplication$class.start(TApplication.scala:42)
 at com.vxdata.etl.SparkETLMain$.start(SparkETLMain.scala:12)
 at 
com.vxdata.etl.SparkETLMain$.delayedEndpoint$com$vxdata$etl$SparkETLMain$1(SparkETLMain.scala:20)
 at 
com.vxdata.etl.SparkETLMain$delayedInit$body.apply(SparkETLMain.scala:12)
 at scala.Function0$class.apply$mcV$sp(Function0.scala:34)
 at 
scala.runtime.AbstractFunction0.apply$mcV$sp(AbstractFunction0.scala:12)
 at scala.App$$anonfun$main$1.apply(App.scala:76)
 at scala.App$$anonfun$main$1.apply(App.scala:76)
 at scala.collection.immutable.List.foreach(List.scala:392)
 at 
scala.collection.generic.TraversableForwarder$class.foreach(TraversableForwarder.scala:35)
 at scala.App$class.main(App.scala:76)
 at com.vxdata.etl.SparkETLMain$.main(SparkETLMain.scala:12)
 at com.vxdata.etl.SparkETLMain.main(SparkETLMain.scala)
 at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
 at 
sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
 at 
sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
 at java.lang.reflect.Method.invoke(Method.java:498)
 at 
org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:673)
 Caused by: org.codehaus.jackson.map.exc.UnrecognizedPropertyException: 
Unrecognized field "keysType" (Class 
org.apache.doris.spark.rest.models.Schema), not marked as ignorable
 at [Source: java.io.StringReader@7a1fa00e; line: 1, column: 14] (through 
reference chain: org.apache.doris.spark.rest.models.Schema["keysType"])
 at 
org.codehaus.jackson.map.exc.UnrecognizedPropertyException.from(UnrecognizedPropertyException.java:53)
 at 
org.codehaus.jackson.map.deser.StdDeserializationContext.unknownFieldException(StdDeserializationContext.java:267)
 at 
org.codehaus.jackson.map.deser.std.StdDeserializer.reportUnknownProperty(StdDeserializer.java:673)
 at 
org.codehaus.jackson.map.deser.std.StdDeserializer.handleUnknownProperty(StdDeserializer.java:659)
 at 
org.codehaus.jackson.map.deser.BeanDeserializer.handleUnknownProperty(BeanDeserializer.java:1365)
 at 
org.codehaus.jackson.map.deser.BeanDeserializer._handleUnknown(BeanDeserializer.java:725)
 at 
org.codehaus.jackson.map.deser.BeanDeserializer.deserializeFromObject(BeanDeserializer.java:703)
 at 
org.codehaus.jackson.map.deser.BeanDeserializer.deserialize(BeanDeserializer.java:580)
 at 
org.codehaus.jackson.map.ObjectMapper._readMapAndClose(ObjectMapper.java:2732)
 at org.codehaus.jackson.map.ObjectMapper.readValue(ObjectMapper.java:1863)
 at 
org.apache.doris.spark.rest.RestService.parseSchema(RestService.java:295) ... 
38 more ;
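The exception indicates that the connector's org.apache.doris.spark.rest.models.Schema class does not recognize the "keysType" field that the Doris 1.0 FE now returns, which usually means the Spark Doris connector jar predates Doris 1.0; upgrading to a connector build matched to 1.0 normally resolves it. For reference, a minimal Scala sketch of the read path that triggers this schema fetch (FE address, table and credentials are placeholders):

import org.apache.spark.sql.SparkSession

object DorisReadSketch {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("doris-read-sketch").getOrCreate()

    // FE http address, database, table and credentials below are placeholders.
    val df = spark.read
      .format("doris")
      .option("doris.fenodes", "fe_host:8030")
      .option("doris.table.identifier", "example_db.example_tbl")
      .option("user", "root")
      .option("password", "")
      .load()   // the schema is fetched here, which is where the error above is thrown

    df.show()
  }
}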

Re: Stream load import fails with abort reason: coordinate BE is down

2022-06-29 Thread james
The coordinate BE is the BE that the load request was sent to; check whether that BE went down.
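One way to check is to list backend liveness from the FE. A small sketch, assuming the FE's default MySQL protocol port 9030, placeholder credentials, and a MySQL JDBC driver on the classpath:

import java.sql.DriverManager

object ShowBackendsSketch {
  def main(args: Array[String]): Unit = {
    // Connect to the FE over its MySQL protocol port (9030 by default).
    // Host and credentials are placeholders.
    val conn = DriverManager.getConnection("jdbc:mysql://fe_host:9030/", "root", "")
    try {
      val rs = conn.createStatement().executeQuery("SHOW BACKENDS")
      val meta = rs.getMetaData
      // Print every column; names such as Alive / LastHeartbeat vary slightly by version.
      while (rs.next()) {
        val row = (1 to meta.getColumnCount)
          .map(i => s"${meta.getColumnName(i)}=${rs.getString(i)}")
          .mkString(", ")
        println(row)
      }
    } finally {
      conn.close()
    }
  }
}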



