On 15 separate days, Alice spends 1 minute in close proximity to Carol, who is infected with COVID-19. Bob, on the other hand, spends 15 minutes on a single day with Carol. Assuming Carol's infectiousness is the same on every day and everything else about the risk factors is constant, is Bob more likely to get infected than Alice? If so, by how much?
Bob is more likely to develop COVID-19, as his single exposure delivers 15x the viral dose of any one of Alice's exposures (given that this is a thought experiment and we are assuming "all else equal" - in reality, of course, all else is never equal).
This is because COVID is primarily spread via exhaled droplets, and people breathe several times per minute, so exposure can be usefully modeled as a smooth rate of virus particles encountered per second, cleared from the body over a timescale of hours, with a threshold of accumulated exposure (variable by subject, and not precise) beyond which infection takes hold. Contrast this with something like standing by the target at an archery range: there it doesn't matter whether you are present for 15 minutes in a row or 15 separate minutes, you have the same chance of getting hit by an arrow (and an arrow hit is a single large chunk of damage, not a smooth, time-mediated gradient). Contrast also with something like cooking, where you are more likely to burn yourself cooking for 1 minute on each of 15 days than for 15 minutes on one day.
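The two regimes above can be sketched in a toy model. All numbers here (intake rate, threshold, per-minute hit probability) are made-up illustrative assumptions, not real epidemiological values; the only point is the structural difference between a clear-then-accumulate threshold model and independent discrete-hit trials.

```python
# Toy dose-threshold model: particles accumulate during exposure and
# are fully cleared between days (clearance timescale << 24 h).
INTAKE_PER_MIN = 100   # particles inhaled per minute near an infected person (made up)
THRESHOLD = 500        # accumulated particles needed to establish infection (made up)

def peak_load(minutes_per_day: int, days: int) -> int:
    """Highest viral load reached, assuming full clearance between days."""
    # With clearance between days, each day starts from zero, so the
    # peak is just a single day's dose; extra days don't stack.
    return minutes_per_day * INTAKE_PER_MIN

alice_peak = peak_load(minutes_per_day=1, days=15)    # 100 particles
bob_peak = peak_load(minutes_per_day=15, days=1)      # 1500 particles

print("Alice ever crosses threshold:", alice_peak >= THRESHOLD)  # False
print("Bob ever crosses threshold:  ", bob_peak >= THRESHOLD)    # True

# Contrast: the "archery range" model, where each minute is an
# independent chance of a discrete hit with probability p.
p_hit_per_min = 0.01  # made-up per-minute hit probability
prob_15_separate = 1 - (1 - p_hit_per_min) ** 15   # Alice's schedule
prob_15_in_a_row = 1 - (1 - p_hit_per_min) ** 15   # Bob's schedule: identical
print("Archery model, both schedules:", round(prob_15_separate, 3))
```

In the threshold model the timing of the minutes is everything; in the independent-trial model only their total count matters, so the two schedules come out identical.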