We have a web application that uses the ESAPI 2.1.0 jar to validate URLs and query strings (e.g. a request like http://ift.tt/1GJAu6ialert("hacking")). We have implemented a new filter class implementing the base Filter interface, and it performs this validation in its doFilter method.
We are seeing a major performance issue with this approach. The code we used is below.
Example 1:
StringBuffer requestUrl = httpServletRequest.getRequestURL();
String queryString = httpServletRequest.getQueryString();
if (StringUtils.isNotBlank(queryString)) {
    requestUrl.append('?').append(queryString);
}
String url = requestUrl.toString();
boolean isSafe = ESAPI.validator().isValidInput("URL", url, "URL", url.length(), true, false);
if (!isSafe) {
    logger.error("Unsafe: URL contains scripting elements. Someone is trying to manipulate the URL");
}
validation.properties:
Validator.URL=^(ht|f)tp(s?)\\:\\/\\/[0-9a-zA-Z]([-.\\w]*[0-9a-zA-Z])*(:(0-9)*)*(\\/?)([a-zA-Z0-9\\-\\.\\?\\,\\:\\'\\/\\\\\\+=&%\\$#_]*)?$
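One thing worth testing is whether the per-request cost comes from looking up and recompiling the validation pattern on each call rather than from the match itself. As a baseline, the same Validator.URL regex can be precompiled once and reused; this is a minimal sketch using plain java.util.regex (the class and method names here are ours, not ESAPI's):

```java
import java.util.regex.Pattern;

public class UrlPatternCheck {
    // Same pattern as Validator.URL from validation.properties,
    // compiled once instead of on every request.
    private static final Pattern URL_PATTERN = Pattern.compile(
        "^(ht|f)tp(s?)\\:\\/\\/[0-9a-zA-Z]([-.\\w]*[0-9a-zA-Z])*(:(0-9)*)*(\\/?)"
        + "([a-zA-Z0-9\\-\\.\\?\\,\\:\\'\\/\\\\\\+=&%\\$#_]*)?$");

    public static boolean isSafeUrl(String url) {
        return URL_PATTERN.matcher(url).matches();
    }

    public static void main(String[] args) {
        System.out.println(isSafeUrl("http://example.com/index.html"));                  // true
        System.out.println(isSafeUrl("http://example.com/<script>alert('x')</script>")); // false
    }
}
```

If this baseline is fast while the ESAPI call is slow, the overhead is in the validator lookup/canonicalization path rather than the regex.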
Example 2: (tried the below as well) Created a simple program with a main method.
List<String> myList = myset(); // the list holds 700 valid and malformed/invalid URLs
System.out.println("Total size of URLs: " + myList.size());
for (int i = 0; i < myList.size(); i++) {
    System.out.println("Item no: " + i);
    String url = validation(myList.get(i));
    boolean isSafe = ESAPI.validator().isValidInput("URL", url, "URL", url.length(), false, false);
}
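To get actual numbers out of a loop like this, each validation call can be timed with System.nanoTime(). The sketch below substitutes the Validator.URL regex via plain java.util.regex so it runs without the ESAPI jar (an assumption for illustration; in your environment, swap the match call back for ESAPI.validator().isValidInput(...)), and the URL list stands in for myset():

```java
import java.util.Arrays;
import java.util.List;
import java.util.regex.Pattern;

public class UrlTimingDemo {
    // Validator.URL pattern from validation.properties, standing in
    // for the ESAPI validator so the snippet is self-contained.
    static final Pattern URL_PATTERN = Pattern.compile(
        "^(ht|f)tp(s?)\\:\\/\\/[0-9a-zA-Z]([-.\\w]*[0-9a-zA-Z])*(:(0-9)*)*(\\/?)"
        + "([a-zA-Z0-9\\-\\.\\?\\,\\:\\'\\/\\\\\\+=&%\\$#_]*)?$");

    public static void main(String[] args) {
        // Hypothetical stand-in for myset(): a mix of valid and invalid URLs.
        List<String> urls = Arrays.asList(
            "http://example.com/index.html",
            "https://example.com/a/b?x=1",
            "http://example.com/<script>alert('x')</script>");
        for (String url : urls) {
            long start = System.nanoTime();
            boolean isSafe = URL_PATTERN.matcher(url).matches();
            long elapsedMicros = (System.nanoTime() - start) / 1000;
            System.out.println(url + " -> safe=" + isSafe + " (" + elapsedMicros + " us)");
        }
    }
}
```

Timing per URL, rather than only printing the index, makes it possible to see whether the slow cases are concentrated in the invalid URLs.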
Can you please help us understand the performance impact of the above logic (Example 1 and Example 2)? We also observed that performance is drastically slower for unsafe URLs.
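On the slow unsafe URLs, one hypothesis worth testing is catastrophic backtracking: the host part of the Validator.URL pattern, ([-.\w]*[0-9a-zA-Z])*, nests a * inside a *-repeated group, so when a long input ultimately fails to match, the engine can retry exponentially many ways of splitting the host before reporting failure. A minimal demonstration with the same kind of nested quantifier (the pattern and inputs here are illustrative, not taken from ESAPI):

```java
import java.util.regex.Pattern;

public class BacktrackingDemo {
    // Nested quantifiers shaped like the host part of Validator.URL:
    // an outer * repeating a group that itself contains a *.
    static final Pattern HOST_LIKE = Pattern.compile("^([-.\\w]*[0-9a-zA-Z])*$");

    public static void main(String[] args) {
        // A run of word characters followed by one character the pattern
        // rejects ('<') forces the engine to try many splits of the run
        // before it can report a non-match; time roughly doubles per step.
        for (int n = 18; n <= 24; n += 2) {
            String almostMatch = "a".repeat(n) + "<";
            long start = System.nanoTime();
            boolean matches = HOST_LIKE.matcher(almostMatch).matches();
            long millis = (System.nanoTime() - start) / 1_000_000;
            System.out.println("length " + n + ": matches=" + matches + ", " + millis + " ms");
        }
    }
}
```

If this is the cause, it would explain the observation directly: safe URLs match on the first path the engine tries, while unsafe ones exhaust the backtracking space before failing.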