Add tests for remote inference with SingleLabelClassifier
AlanAboudib opened this issue
Alan Aboudib commented
Description
Add tests covering remote inference with the SingleLabelClassifier.
Type of Test
- Unit test (e.g. checking that a loop, method, or function works as intended)
- Integration test (e.g. checking that a group or set of functionality works as intended)
- Regression test (e.g. checking that adding or removing a module of code still allows other systems to function as intended)
- Stress test (e.g. checking how well a system performs under various situations, including heavy usage)
- Performance test (e.g. checking how efficient a system is at performing the intended task)
- Other...
Expected Behavior
A clear and concise description of what you expected to happen. Do you intend to reach a certain percentage of test coverage for this file, feature, or codebase?
Additional Context
Add any other context about the tests here.
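A test for this could be sketched roughly as follows. Note that the worker and classifier classes below are hypothetical stand-ins written for illustration only; the issue does not show the actual SyferText/PySyft API, so a real test would replace them with the library's own remote worker and `SingleLabelClassifier`.

```python
# Hedged sketch of a remote-inference unit test.
# FakeRemoteWorker and FakeSingleLabelClassifier are hypothetical
# placeholders, NOT the real SyferText API: they only model the
# send-to-worker / run-inference / fetch-result round trip that the
# requested test would exercise.
import unittest


class FakeRemoteWorker:
    """Stand-in for a remote worker: stores objects sent to it by id."""

    def __init__(self):
        self.objects = {}

    def send(self, obj_id, obj):
        # Simulate sending an object to the remote worker.
        self.objects[obj_id] = obj

    def get(self, obj_id):
        # Simulate fetching the object back (removing it remotely).
        return self.objects.pop(obj_id)


class FakeSingleLabelClassifier:
    """Stand-in classifier that assigns a single label to a text."""

    def predict(self, text):
        return "positive" if "good" in text else "negative"


class TestRemoteInference(unittest.TestCase):
    def test_prediction_round_trip(self):
        worker = FakeRemoteWorker()
        # Send the classifier to the (fake) remote worker, fetch it
        # back, and check that predictions survive the round trip.
        worker.send("clf", FakeSingleLabelClassifier())
        clf = worker.get("clf")
        self.assertEqual(clf.predict("a good movie"), "positive")
        self.assertEqual(clf.predict("a bad movie"), "negative")
        # The object should no longer live on the worker after get().
        self.assertNotIn("clf", worker.objects)


if __name__ == "__main__":
    unittest.main()
```

In a real test the round trip would go through the library's actual remote-execution machinery rather than an in-process dictionary, but the assertions (correct label, object lifecycle on the worker) would stay the same shape.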
j-chim commented
Hi, new mentee here. I recently completed the Udacity course and the series of OpenMined-related tutorials, and am hoping to start working on my first issue.
Is this still available? If it is I'd love to work on it. Thanks!
Jatin Prakash commented
Sure, go ahead @j-chim 👍
j-chim commented
Thanks @bicycleman15. If I have questions, can I ping you on Slack?
Jatin Prakash commented
Sure. It's @jatin on Slack.