Double eval "trick" using SharePoint lists - InfoPath Dev


Double eval "trick" using SharePoint lists

Last post 11-02-2010 02:03 PM by ocampod. 6 replies.
  • 11-02-2010 06:33 AM

    Double eval "trick" using SharePoint lists

    Hi everyone,

    I found a nice guide on how to concatenate email addresses from a repeating group here: Now I tried to replicate it with a SharePoint list, but I haven't had much luck. I get many errors like "end of string" and "invalid arguments" even though I followed the guide, replacing the appropriate field references. Has anyone been able to accomplish this?

    The reason I want this is that a user wants to create a distribution list and email that group. I know I can use a SharePoint group for emailing, but (a) it's a hassle having to go in and find each person, and (b) I don't want to grant this user that kind of permission.

    Any help would be great! Thanks
  • 11-02-2010 07:00 AM In reply to

    Re: Double eval "trick" using SharePoint lists

    I'll give you the example of the one I have and try to explain it the best way I know how. I had problems with this for weeks until I finally understood it.

    eval(eval(SharePointListItem_RW[(reportsto = Emp_Pos_Nbr) and (emplid != "")], 'concat("POS#-", d:posnbr, "_EID#-", d:emplid, "-", ";")'), "..")

    Reading it piece by piece: eval(eval(RepeatingGroup[filter], 'concat(...)'), ".."). The inner eval takes the repeating group (not the field you want), optionally filtered. Note that the concat expression is wrapped in single quotes ('), not double quotes ("). Inside the concat, the d: or m: prefix has to match where your repeating group lives: if the group is in your own form, use my:fieldnameinthegroup; if it comes from a data source, use d:fieldnameinthegroup.

    As you can see, I pull two fields from each row the filter selects out of the repeating group and separate them with literal strings, so each row yields: POS#-FIELD_POSNBR_EID#-FIELD_EMPLID-; and that repeats for the next row.
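    (Side note, not InfoPath: the same aggregation can be sketched in Python over a made-up XML payload. The row data and the manager number 100 below are invented for illustration; only the field names and the output format come from the formula above. The inner eval corresponds to building one concat string per filtered row, and the outer eval with ".." corresponds to joining them.)

    ```python
    import xml.etree.ElementTree as ET

    # Hypothetical rows mirroring the d: repeating group in the formula.
    xml = """
    <rows>
      <row><reportsto>100</reportsto><posnbr>P1</posnbr><emplid>E1</emplid></row>
      <row><reportsto>100</reportsto><posnbr>P2</posnbr><emplid>E2</emplid></row>
      <row><reportsto>200</reportsto><posnbr>P3</posnbr><emplid></emplid></row>
    </rows>
    """
    root = ET.fromstring(xml)
    emp_pos_nbr = "100"

    # Inner eval: one concat(...) string per row that passes the filter
    # (reportsto = Emp_Pos_Nbr and emplid != "").
    parts = [
        "POS#-{}_EID#-{}-;".format(r.findtext("posnbr"), r.findtext("emplid"))
        for r in root.findall("row")
        if r.findtext("reportsto") == emp_pos_nbr and r.findtext("emplid")
    ]
    # Outer eval with "..": the parent context flattens the per-row
    # results into a single string.
    result = "".join(parts)
    print(result)  # POS#-P1_EID#-E1-;POS#-P2_EID#-E2-;
    ```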

    Hope this helps; if you have problems, post your string.



  • 11-02-2010 10:59 AM In reply to

    Re: Double eval "trick" using SharePoint lists

    Barry,

    Thanks for following up! Here's my function:

    eval(eval(Refunds_Team, 'concat(d:@Email_Address,";")'), "..")

    The "Refunds_Team" string is the SharePoint list, and @Email_Address is the field I am trying to concatenate. This is the error I am getting:

    msxml5.dll Expected token ')' found ':'. concat(d-->:
  • 11-02-2010 11:15 AM In reply to

    Re: Double eval "trick" using SharePoint lists


    I think your problem is in the Refunds_Team part. Use Insert Field or Group, select your data source, open dataFields, and select d:SharePointListItem_RW; then inside the concat it should be d:Email_Address.

    I would envision it looking like eval(eval(SharePointListItem_RW, 'concat(d:Email_Address,";")'), "..")


  • 11-02-2010 11:32 AM In reply to

    Re: Double eval "trick" using SharePoint lists

    Barry, I followed your steps but I get this error:

    Reference to undeclared namespace prefix: 'd'.

    Error occurred during a call to property or method 'Eval'

    For "Refunds_Team", I'm selecting the repeating group within my secondary data source. Can't I use the XPath to refer to "Email_Address"?
  • 11-02-2010 11:42 AM In reply to

    Re: Double eval "trick" using SharePoint lists

    Not sure; here is my XPath statement for you to compare:


    xdMath:Eval(xdMath:Eval(xdXDocument:GetDOM("Retrieval Position")/dfs:myFields/dfs:dataFields/d:SharePointListItem_RW[(d:reportsto = xdXDocument:get-DOM()/my:myFields/my:Position/my:Emp_Pos_Nbr) and (d:emplid != "")], 'concat("POS#-", d:posnbr, "_EID#-", d:emplid, "-", ";")'), "..")


  • 11-02-2010 02:03 PM In reply to

    Re: Double eval "trick" using SharePoint lists


    That did the trick!

    Here's my XPath:

    xdMath:Eval(xdMath:Eval(xdXDocument:GetDOM("Refunds Team")/dfs:myFields/dfs:dataFields/dfs:Refunds_Team, 'concat(@E-mail_Address, ";")'), "..")
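    (Side note, not InfoPath: the attribute selection @E-mail_Address avoids the prefix problem because it reads an attribute on the row itself. A Python sketch of the same aggregation; the XML shape and the addresses are guesses, and only the Refunds_Team and E-mail_Address names come from the post:)

    ```python
    import xml.etree.ElementTree as ET

    # Guessed shape of the "Refunds Team" rows, with E-mail_Address as a
    # row attribute, matching the @E-mail_Address selection above.
    xml = """
    <dataFields>
      <Refunds_Team E-mail_Address="one@contoso.com"/>
      <Refunds_Team E-mail_Address="two@contoso.com"/>
    </dataFields>
    """
    root = ET.fromstring(xml)

    # Each row contributes concat(@E-mail_Address, ";"); the outer
    # Eval("..") glues the per-row results into one string.
    joined = "".join(
        r.get("E-mail_Address") + ";" for r in root.findall("Refunds_Team")
    )
    print(joined)  # one@contoso.com;two@contoso.com;
    ```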

    Thanks for helping me out!

Copyright © 2003-2019 Qdabra Software. All rights reserved.