Settings and activity
53 results found
-
5 votes
Leeann Wright supported this idea ·
-
9 votes
Leeann Wright supported this idea ·
-
15 votes
Leeann Wright supported this idea ·
-
8 votes · Proposed Idea · 1 comment · CommunitySuite Idea Lab » Giving Hub (formerly Donation Portal) · Admin →
Leeann Wright supported this idea ·
-
2 votes
Leeann Wright supported this idea ·
-
6 votes
Leeann Wright supported this idea ·
-
2 votes
Leeann Wright shared this idea ·
-
13 votes
Leeann Wright supported this idea ·
-
100 votes
Leeann Wright supported this idea ·
-
80 votes
Leeann Wright supported this idea ·
-
102 votes
Leeann Wright supported this idea ·
-
95 votes
Leeann Wright supported this idea ·
-
234 votes
Leeann Wright supported this idea ·
-
11 votes
Leeann Wright supported this idea ·
-
7 votes
Leeann Wright shared this idea ·
-
70 votes
Thank you for the idea.
Would you hope this is implemented across all processes?
Leeann Wright supported this idea ·
-
31 votes
Leeann Wright supported this idea ·
-
74 votes
Leeann Wright supported this idea ·
-
61 votes
Leeann Wright supported this idea ·
-
224 votes
Leeann Wright supported this idea ·
My foundation allows students to use AI to generate ideas or to review their work (for example, with Grammarly). We require a disclosure from the student if they used AI, along with a description of how it was used. However, we are seeing a steep rise in students submitting AI-generated work and trying to pass it off as their own, which is unacceptable academic misconduct. It would be helpful to have a screening tool to help identify AI-generated content, as we want to ensure students are acting with academic integrity.