Description
Search before asking
- I had searched in the issues and found no similar issues.
What happened
My DynamoDB table UserTable contains 1 million items. When I read it with the Amazon DynamoDB source connector, the job finishes after reading only 93 items, which may be a bug.
SeaTunnel Version
2.3.2
SeaTunnel Config
env {
  execution.parallelism = 1
  job.mode = "BATCH"
}

source {
  Amazondynamodb {
    url = "https://dynamodb.us-east-1.amazonaws.com"
    region = "us-east-1"
    access_key_id = ""
    secret_access_key = ""
    table = "UserTable"
    schema = {
      fields {
        userID = string
      }
    }
    result_table_name = "source_table"
  }
}

transform {
  FieldMapper {
    source_table_name = "source_table"
    result_table_name = "transform_table"
    field_mapper = {
      userID = user_id
    }
  }
}

sink {
  Console {
    source_table_name = "transform_table"
  }
}
Running Command
../bin/seatunnel.sh -e local --config ddb.template
Error Exception
The table contains 1,000,000 rows, but only 93 items were read.
2023-07-20 08:59:31,452 INFO org.apache.seatunnel.core.starter.seatunnel.command.ClientExecuteCommand -
***********************************************
Job Statistic Information
***********************************************
Start Time : 2023-07-20 08:59:27
End Time : 2023-07-20 08:59:31
Total Time(s) : 3
Total Read Count : 93
Total Write Count : 93
Total Failed Count : 0
***********************************************
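For context: DynamoDB's Scan API returns at most 1 MB of data per request, and a reader must keep issuing follow-up requests with `ExclusiveStartKey` set to the previous response's `LastEvaluatedKey` until no `LastEvaluatedKey` is returned. Reading exactly one partial batch, as in this report, is consistent with a connector that stops after the first page, though that is only a hypothesis about the cause. A minimal sketch of the required pagination loop (`fake_scan`, `scan_all`, and the page contents are hypothetical stand-ins for boto3's `table.scan()`, so the loop runs without AWS credentials):

```python
# DynamoDB Scan returns at most 1 MB per request; the caller must follow
# LastEvaluatedKey to read the whole table. fake_scan simulates a table
# whose first 1 MB page holds 93 items, mirroring the symptom above.

PAGES = [
    {"Items": [{"userID": str(i)} for i in range(0, 93)],
     "LastEvaluatedKey": {"userID": "92"}},                 # first page, more to come
    {"Items": [{"userID": str(i)} for i in range(93, 200)]}  # final page, no key
]

def fake_scan(ExclusiveStartKey=None):
    # First page when no start key is given, otherwise the next page.
    return PAGES[0] if ExclusiveStartKey is None else PAGES[1]

def scan_all(scan):
    """Collect every item by following LastEvaluatedKey until it is absent."""
    items, start_key = [], None
    while True:
        resp = scan(ExclusiveStartKey=start_key) if start_key else scan()
        items.extend(resp["Items"])
        start_key = resp.get("LastEvaluatedKey")
        if start_key is None:  # no more pages
            break
    return items

print(len(scan_all(fake_scan)))  # 200 - both pages, not just the first 93
```

A reader that ignores `LastEvaluatedKey` would return only the 93 items of the first page, which matches the job statistics above.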
Flink or Spark Version
No response
Java or Scala Version
No response
Screenshots
No response
Are you willing to submit PR?
- Yes I am willing to submit a PR!
Code of Conduct
- I agree to follow this project's Code of Conduct