Jackie-Jiang commented on code in PR #8503:
URL: https://github.com/apache/pinot/pull/8503#discussion_r847842149


##########
pinot-core/src/main/java/org/apache/pinot/core/common/BlockValSet.java:
##########
@@ -29,6 +29,13 @@
  */
 public interface BlockValSet {
 
+  /**
+   * Returns the number of entries for a single-valued column.
+   *
+   * @return number of single-valued entries
+   */
+  int getNumSVEntries();

Review Comment:
   I don't think we need this API. If it is really required, let's rename it to `getNumDocs()`.



##########
pinot-clients/pinot-jdbc-client/src/main/java/org/apache/pinot/client/utils/DriverUtils.java:
##########
@@ -134,6 +135,10 @@ public static String getJavaClassName(String columnDataType) {
       case "DOUBLE":
         columnsJavaClassName = Double.class.getTypeName();
         break;
+      case "BIG_DECIMAL":

Review Comment:
   Should we add this to `getSQLDataType()` as well? Is it possible for the value to be `BIGDECIMAL` when it comes from the `ResultSet`?
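
   The actual body of `getSQLDataType()` is not shown in this diff, so the sketch below is a hypothetical mirror of it, illustrating how a `BIG_DECIMAL` case could map to JDBC's `Types.DECIMAL` alongside the existing mappings (class name, cases, and fallback are assumptions):

   ```java
   import java.sql.Types;

   // Hypothetical mirror of DriverUtils.getSQLDataType(); sketches the
   // proposed BIG_DECIMAL -> DECIMAL mapping next to the existing cases.
   public final class SqlTypeMapping {

     static int getSQLDataType(String columnDataType) {
       switch (columnDataType) {
         case "INT":
           return Types.INTEGER;
         case "LONG":
           return Types.BIGINT;
         case "DOUBLE":
           return Types.DOUBLE;
         case "BIG_DECIMAL": // proposed addition, mirroring getJavaClassName()
           return Types.DECIMAL;
         default:
           return Types.VARCHAR; // assumed fallback for unmapped types
       }
     }

     public static void main(String[] args) {
       System.out.println(getSQLDataType("BIG_DECIMAL") == Types.DECIMAL); // true
     }
   }
   ```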



##########
pinot-core/src/main/java/org/apache/pinot/core/common/DataBlockCache.java:
##########
@@ -255,9 +255,10 @@ public String[] getStringValuesForSVColumn(String column) {
    * @param column Column name
    * @param evaluator transform evaluator
    * @param buffer values to fill
+   * @param parseExactBigDecimal parse exact BigDecimal values
    */
-  public void fillValues(String column, TransformEvaluator evaluator, String[] buffer) {
-    _dataFetcher.fetchStringValues(column, evaluator, _docIds, _length, buffer);
+  public void fillValues(String column, TransformEvaluator evaluator, String[] buffer, boolean parseExactBigDecimal) {

Review Comment:
   We don't need to parse big decimal in the physical data access layer. It should be read as `byte[]`. It is not feasible to add physical data access for every individual logical type.
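
   A minimal sketch of the `byte[]` round trip this suggests, assuming a simple scale-plus-unscaled-value encoding (Pinot's own serialization utilities may use a different layout; this is illustrative only):

   ```java
   import java.math.BigDecimal;
   import java.math.BigInteger;
   import java.nio.ByteBuffer;

   // Sketch: the physical layer stores and returns raw bytes; only the
   // logical layer converts those bytes to/from BigDecimal on demand.
   public final class BigDecimalBytes {

     static byte[] serialize(BigDecimal value) {
       byte[] unscaled = value.unscaledValue().toByteArray();
       return ByteBuffer.allocate(4 + unscaled.length)
           .putInt(value.scale())  // 4-byte scale prefix
           .put(unscaled)          // unscaled value as big-endian bytes
           .array();
     }

     static BigDecimal deserialize(byte[] bytes) {
       ByteBuffer buffer = ByteBuffer.wrap(bytes);
       int scale = buffer.getInt();
       byte[] unscaled = new byte[bytes.length - 4];
       buffer.get(unscaled);
       return new BigDecimal(new BigInteger(unscaled), scale);
     }

     public static void main(String[] args) {
       BigDecimal original = new BigDecimal("12345.6789");
       System.out.println(deserialize(serialize(original))); // 12345.6789
     }
   }
   ```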



##########
pinot-core/src/main/java/org/apache/pinot/core/operator/blocks/IntermediateResultsBlock.java:
##########
@@ -328,14 +328,14 @@ public DataTable getDataTable()
   private DataTable getResultDataTable()
       throws IOException {
     DataTableBuilder dataTableBuilder = new DataTableBuilder(_dataSchema);
-    ColumnDataType[] storedColumnDataTypes = _dataSchema.getStoredColumnDataTypes();
+    ColumnDataType[] columnDataTypes = _dataSchema.getColumnDataTypes();

Review Comment:
   We don't need to change this. `BIG_DECIMAL` will be handled as `BYTES`.



##########
pinot-core/src/main/java/org/apache/pinot/core/operator/transform/function/BaseTransformFunction.java:
##########
@@ -219,13 +250,74 @@ public double[] transformToDoubleValuesSV(ProjectionBlock projectionBlock) {
           String[] stringValues = transformToStringValuesSV(projectionBlock);
           ArrayCopyUtils.copy(stringValues, _doubleValuesSV, length);
           break;
+        case BYTES:
+          if (getResultMetadata().getDataType().equals(DataType.BIG_DECIMAL)) {
+            BigDecimal[] bigDecimalValues = transformToBigDecimalValuesSV(projectionBlock);
+            ArrayCopyUtils.copy(bigDecimalValues, _doubleValuesSV, length);
+            break;
+          }
+          throw new IllegalStateException();
         default:
           throw new IllegalStateException();
       }
     }
     return _doubleValuesSV;
   }
 
+  // Note 1: Why do we need transformToBigDecimalValuesSV(projectionBlock) when there is none for the TIMESTAMP
+  // data type? This is because TIMESTAMP's underlying stored type is LONG and we already have a method for LONG
+  // that can convert to other numeric data types (check: transformToLongValuesSV(projectionBlock) method).
+  // However, BIG_DECIMAL's underlying stored type is BYTES and transformToBytesValuesSV(projectionBlock) handles
+  // only the conversion from STRING (check: transformToBytesValuesSV(projectionBlock) method).
+  //
+  // Note 2: An alternative way of transforming projectionBlock into BigDecimal[] values is to call the
+  // transformToBytesValuesSV(projectionBlock) method that returns byte[] values and then transform byte[] to
+  // BigDecimal[]. However, this requires three operations (vs. two in this method):
+  // 1- transformTo[Type]ValuesSV(projectionBlock) based on the underlying stored type.
+  // 2- ArrayCopyUtils.copy([Type]Values, bytesValues, length).
+  // 3- ArrayCopyUtils.copy(bytesValues, bigDecimals, length).
+  @Override
+  public BigDecimal[] transformToBigDecimalValuesSV(ProjectionBlock projectionBlock) {

Review Comment:
   Let's not add this method for now. We might not want to support implicit casts between big decimal and other data types (this is not the SQL semantic either). Users can cast explicitly using a transform function.
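
   To illustrate why implicit casts can be undesirable here: widening a big decimal to `DOUBLE` silently drops digits beyond a double's ~15-17 significant decimal digits. A small self-contained sketch (class and method names are illustrative, not from the PR):

   ```java
   import java.math.BigDecimal;

   // Demonstrates the precision loss behind preferring explicit casts:
   // round-tripping a high-precision BigDecimal through double changes it.
   public final class ImplicitCastLoss {

     // Returns true if converting the value to double and back changes it.
     static boolean losesPrecision(BigDecimal exact) {
       return exact.compareTo(new BigDecimal(exact.doubleValue())) != 0;
     }

     public static void main(String[] args) {
       // 27 significant digits: more than a double can represent.
       System.out.println(losesPrecision(new BigDecimal("123456789.123456789123456789"))); // true
       // Small integral values survive the round trip.
       System.out.println(losesPrecision(new BigDecimal("42"))); // false
     }
   }
   ```

   Making such narrowing conversions explicit keeps any precision loss a deliberate, user-requested choice rather than a silent side effect.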



-- 
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.

To unsubscribe, e-mail: commits-unsubscr...@pinot.apache.org

For queries about this service, please contact Infrastructure at:
us...@infra.apache.org

