assertCSVDataSet fails when csv dataset to assert starts with all zeros #451
Looks like the problem is here: Classes/Core/Functional/Framework/DataHandling/DataSet.php, line 108.
I would love to provide a pull request with a fix, but I'm not sure why this check is there in the first place.
I guess the check is there because the first value/column should be the primary key / auto-increment value of the table. So simply removing that empty check is a quick workaround, but not really reliable, and would lead to false checks quite quickly. For now you could implement the assertion yourself with a query:
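Since the comment ends where a query example would go, here is a minimal, self-contained sketch of that idea. It assumes nothing about the TYPO3 testing-framework API: it uses plain PDO with an in-memory SQLite database to assert on the non-uid columns directly.

```php
<?php
// Hypothetical sketch, not the TYPO3 API: assert table contents with a
// direct query so the auto-increment uid never enters the comparison.
$pdo = new PDO('sqlite::memory:');
$pdo->exec('CREATE TABLE my_ext_table (
    uid INTEGER PRIMARY KEY AUTOINCREMENT,
    deleted INTEGER,
    description TEXT,
    some_field TEXT
)');
$pdo->exec("INSERT INTO my_ext_table (deleted, description, some_field)
            VALUES (0, 'Foo', 'A'), (0, 'Bar', 'B'), (0, 'Test', 'C')");

// Select only the columns we want to assert on, in a fixed order.
$rows = $pdo->query(
    'SELECT deleted, description, some_field FROM my_ext_table ORDER BY uid'
)->fetchAll(PDO::FETCH_ASSOC);

$expected = [
    ['deleted' => 0, 'description' => 'Foo',  'some_field' => 'A'],
    ['deleted' => 0, 'description' => 'Bar',  'some_field' => 'B'],
    ['deleted' => 0, 'description' => 'Test', 'some_field' => 'C'],
];
// Loose comparison, because some PDO drivers return integers as strings.
var_dump($rows == $expected); // prints bool(true)
```

In a real functional test the query would go through the framework's connection instead of a throwaway PDO handle; the point is only that a hand-written query sidesteps the uid question entirely.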
I would vote against removing that check. Eventually it should rather be combined with a check for the real PK field etc. (which is quite a bit bigger a change).
@sbuerk Maybe an exception would be nice, so the user can see why the test is failing. Currently it fails kind of "silently", giving the message "Not asserted record found for ..." although you can see the record and all the fields match.
A related second issue has been reported with #488.
I don't follow that argument because
Let's please fix this. Addendum: from an architectural standpoint, a parser is not the place to check data integrity. It should care about syntax, not semantics. The latter can be checked with post-parse validation (if necessary). A parser that drops syntactically correct data is not doing a good job.
Ack. (It already was acked, I just hadn't commented.) Will look into it today, I guess.
Thanks a lot!
I don't agree: in functional tests we do and should know the resulting uids, since the entire scenario is under the functional test's control. Skipping the auto increment is not a good idea and makes asserting those tables harder than it should be. Note that asserting non-uid tables works well: the Core has many scenarios with sys_refindex (where the primary key is a hash), and also with MM rows where the primary key is a combination of multiple fields.
@lolli42 You wouldn't lose the option to specify uids, but would gain the possibility to skip them, which would be a nice time-saver. I don't see any reason to force people to know their uids in a SUT.
@lolli42 Ok, so what I get from the argumentation is:
A parser that silently fails on syntactically correct data seems to be just alright, then. One more patch to add to my list of composer patches for each project.
Everybody, please let's revisit this. Independent of how much sense the use case makes, the current code is incorrect (relative to what it should do): it should only catch the empty string, not all falsey values.
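The distinction is easy to demonstrate in isolation; this snippet only shows PHP's built-in behaviour, not the framework's code:

```php
<?php
// empty() treats any falsey value as empty, including the string "0",
// so a check meant to detect a missing cell also swallows real zeros.
var_dump(empty(''));    // bool(true)  -- genuinely empty
var_dump(empty('0'));   // bool(true)  -- "0" is falsey, hence "empty"
var_dump('0' === '');   // bool(false) -- a strict check keeps the zero
```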
... I'll re-discuss this with Stefan. One reason is that having the uid in the assert CSV makes it very easy for the asserter to locate the rows and to work out whether all rows have been handled, which is usually what we want: there should be neither more nor fewer rows than what is stated in the assert CSV. That's the deal. Ditching the uid restriction potentially makes this harder and will need a more serious look at the asserter.
I'd suggest to keep the "is there a UID?" check, but change it from
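As a hedged sketch of that suggestion (the `hasUid()` helper below is hypothetical, not the framework's actual code), the check could compare against the empty string instead of using `empty()`:

```php
<?php
// Hypothetical helper: keep the "is there a uid?" check, but test for
// the empty string explicitly, so a first column holding 0 still counts.
function hasUid(array $csvRow): bool
{
    // Buggy variant this replaces: return !empty($csvRow[0]);
    return ($csvRow[0] ?? '') !== '';
}

var_dump(hasUid(['', '0', 'Foo'])); // bool(false) -- uid cell left blank
var_dump(hasUid(['0', 'Foo']));     // bool(true)  -- "0" is a value
var_dump(hasUid(['7', 'Foo']));     // bool(true)
```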
This is solved with #488 being done, isn't it?
Solved with #488, confirmed by opener. |
What are you trying to achieve?
Successfully run the assertCSVDataSet function on data that was inserted into the db.
What do you get instead?
assertCSVDataSet fails on running the Test with the message "Not asserted record found for ..."
How to reproduce the issue?
```csv
"my_ext_table",,,
,"deleted","description","some_field"
,0,"Foo","A"
,0,"Bar","B"
,0,"Test","C"
```

```csv
"my_ext_table",,,
,"description","some_field","deleted"
,"Foo","A",0
,"Bar","B",0
,"Test","C",0
```
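A self-contained sketch (not the framework's actual parser) of why the first variant can lose data: filtering CSV cells with `empty()` discards the string `"0"` together with the genuinely blank uid cell, while a strict comparison only drops the empty string.

```php
<?php
// Parse one data row of the first fixture: blank uid, deleted = 0.
$values = str_getcsv(',0,"Foo","A"'); // ['', '0', 'Foo', 'A']

// empty() drops both the blank uid cell and the legitimate 0 ...
$withEmpty = array_values(array_filter($values, fn ($v) => !empty($v)));

// ... while a strict check only drops the truly empty cell.
$withStrict = array_values(array_filter($values, fn ($v) => $v !== ''));

var_dump($withEmpty);  // ['Foo', 'A']      -- the deleted=0 cell is lost
var_dump($withStrict); // ['0', 'Foo', 'A'] -- only the blank uid is gone
```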
Specify some data of the environment
Related to #488